0s autopkgtest [10:24:08]: starting date and time: 2024-06-16 10:24:08+0000
0s autopkgtest [10:24:08]: git checkout: 433ed4cb Merge branch 'skia/nova_flock' into 'ubuntu/5.34+prod'
0s autopkgtest [10:24:08]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.m_kgtk28/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:traitlets --apt-upgrade jupyter-notebook --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=traitlets/5.14.3-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos01-s390x-22.secgroup --name adt-oracular-s390x-jupyter-notebook-20240616-102408-juju-7f2275-prod-proposed-migration-environment-2-20636b6c-7120-4349-8d7d-28bda5c08416 --image adt/ubuntu-oracular-s390x-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com'"'"'' --mirror=http://us.ports.ubuntu.com/ubuntu-ports/
117s autopkgtest [10:26:05]: testbed dpkg architecture: s390x
117s autopkgtest [10:26:05]: testbed apt version: 2.9.5
117s autopkgtest [10:26:05]: @@@@@@@@@@@@@@@@@@@@ test bed setup
118s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB]
118s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB]
118s Get:3 http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB]
118s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B]
118s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B]
118s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main s390x Packages [43.9 kB]
118s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/restricted s390x Packages [1860 B]
118s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x Packages [298 kB]
118s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse s390x Packages [2528 B]
119s Fetched 892 kB in 1s (1182 kB/s)
119s Reading package lists...
121s Reading package lists...
121s Building dependency tree...
121s Reading state information...
121s Calculating upgrade...
122s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
122s Reading package lists...
122s Building dependency tree...
122s Reading state information...
122s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
122s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease
122s Get:2 http://ftpmaster.internal/ubuntu oracular InRelease [110 kB]
123s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease
123s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease
123s Get:5 http://ftpmaster.internal/ubuntu oracular/main Sources [1384 kB]
123s Get:6 http://ftpmaster.internal/ubuntu oracular/universe Sources [20.1 MB]
125s Get:7 http://ftpmaster.internal/ubuntu oracular/main s390x Packages [1336 kB]
125s Get:8 http://ftpmaster.internal/ubuntu oracular/universe s390x Packages [14.9 MB]
130s Fetched 37.8 MB in 8s (4783 kB/s)
131s Reading package lists...
131s Reading package lists...
131s Building dependency tree...
131s Reading state information...
131s Calculating upgrade...
131s The following packages will be upgraded:
131s libldap-common libldap2
132s 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
132s Need to get 230 kB of archives.
132s After this operation, 16.4 kB disk space will be freed.
132s Get:1 http://ftpmaster.internal/ubuntu oracular/main s390x libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB]
132s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x libldap2 s390x 2.6.7+dfsg-1~exp1ubuntu9 [199 kB]
132s Fetched 230 kB in 0s (554 kB/s)
132s (Reading database ... 54671 files and directories currently installed.)
132s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ...
132s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
132s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_s390x.deb ...
132s Unpacking libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
132s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ...
132s Setting up libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) ...
132s Processing triggers for man-db (2.12.1-2) ...
133s Processing triggers for libc-bin (2.39-0ubuntu9) ...
133s Reading package lists...
133s Building dependency tree...
133s Reading state information...
133s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
136s autopkgtest [10:26:24]: testbed running kernel: Linux 6.8.0-31-generic #31-Ubuntu SMP Sat Apr 20 00:14:26 UTC 2024
136s autopkgtest [10:26:24]: @@@@@@@@@@@@@@@@@@@@ apt-source jupyter-notebook
139s Get:1 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (dsc) [3886 B]
139s Get:2 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (tar) [8501 kB]
139s Get:3 http://ftpmaster.internal/ubuntu oracular/universe jupyter-notebook 6.4.12-2.2ubuntu1 (diff) [49.6 kB]
139s gpgv: Signature made Thu Feb 15 18:11:52 2024 UTC
139s gpgv: using RSA key D09F8A854F1055BCFC482C4B23566B906047AFC8
139s gpgv: Can't check signature: No public key
139s dpkg-source: warning: cannot verify inline signature for ./jupyter-notebook_6.4.12-2.2ubuntu1.dsc: no acceptable signature found
140s autopkgtest [10:26:28]: testing package jupyter-notebook version 6.4.12-2.2ubuntu1
140s autopkgtest [10:26:28]: build not needed
141s autopkgtest [10:26:29]: test pytest: preparing testbed
142s Reading package lists...
142s Building dependency tree...
142s Reading state information...
142s Starting pkgProblemResolver with broken count: 0
142s Starting 2 pkgProblemResolver with broken count: 0
142s Done
143s The following additional packages will be installed:
143s fonts-font-awesome fonts-glyphicons-halflings fonts-lato fonts-mathjax gdb
143s jupyter-core jupyter-notebook libbabeltrace1 libdebuginfod-common
143s libdebuginfod1t64 libjs-backbone libjs-bootstrap libjs-bootstrap-tour
143s libjs-codemirror libjs-es6-promise libjs-jed libjs-jquery
143s libjs-jquery-typeahead libjs-jquery-ui libjs-marked libjs-mathjax
143s libjs-moment libjs-requirejs libjs-requirejs-text libjs-sphinxdoc
143s libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 libpgm-5.3-0t64
143s libpython3.12t64 libsodium23 libsource-highlight-common
143s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc
143s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach
143s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil
143s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints
143s python3-executing python3-fastjsonschema python3-html5lib python3-iniconfig
143s python3-ipykernel python3-ipython python3-ipython-genutils python3-jedi
143s python3-jupyter-client python3-jupyter-core python3-jupyterlab-pygments
143s python3-matplotlib-inline python3-mistune python3-nbclient python3-nbconvert
143s python3-nbformat python3-nest-asyncio python3-notebook python3-packaging
143s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs
143s python3-pluggy python3-prometheus-client python3-prompt-toolkit
143s python3-psutil python3-ptyprocess python3-pure-eval python3-py
143s python3-pydevd python3-pytest python3-requests-unixsocket python3-send2trash
143s python3-soupsieve python3-stack-data python3-terminado python3-tinycss2
143s python3-tornado python3-traitlets python3-typeshed python3-wcwidth
143s python3-webencodings python3-zmq sphinx-rtd-theme-common
143s Suggested packages:
143s gdb-doc gdbserver libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs
143s fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc
143s python-bleach-doc python-bytecode-doc python-coverage-doc
143s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc
143s python3-pip python-nbconvert-doc texlive-fonts-recommended
143s texlive-plain-generic texlive-xetex python-pexpect-doc subversion pydevd
143s python-terminado-doc python-tinycss2-doc python3-pycurl python-tornado-doc
143s python3-twisted
143s Recommended packages:
143s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc
143s python3-ipywidgets
143s The following NEW packages will be installed:
143s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings fonts-lato
143s fonts-mathjax gdb jupyter-core jupyter-notebook libbabeltrace1
143s libdebuginfod-common libdebuginfod1t64 libjs-backbone libjs-bootstrap
143s libjs-bootstrap-tour libjs-codemirror libjs-es6-promise libjs-jed
143s libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked
143s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text
143s libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64
143s libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common
143s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc
143s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach
143s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil
143s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints
143s python3-executing python3-fastjsonschema python3-html5lib python3-iniconfig
143s python3-ipykernel python3-ipython python3-ipython-genutils python3-jedi
143s python3-jupyter-client python3-jupyter-core python3-jupyterlab-pygments
143s python3-matplotlib-inline python3-mistune python3-nbclient python3-nbconvert
143s python3-nbformat python3-nest-asyncio python3-notebook python3-packaging
143s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs
143s python3-pluggy python3-prometheus-client python3-prompt-toolkit
143s python3-psutil python3-ptyprocess python3-pure-eval python3-py
143s python3-pydevd python3-pytest python3-requests-unixsocket python3-send2trash
143s python3-soupsieve python3-stack-data python3-terminado python3-tinycss2
143s python3-tornado python3-traitlets python3-typeshed python3-wcwidth
143s python3-webencodings python3-zmq sphinx-rtd-theme-common
143s 0 upgraded, 96 newly installed, 0 to remove and 0 not upgraded.
143s Need to get 33.5 MB/33.5 MB of archives.
143s After this operation, 170 MB of additional disk space will be used.
143s Get:1 /tmp/autopkgtest.s4beMp/1-autopkgtest-satdep.deb autopkgtest-satdep s390x 0 [752 B] 143s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-lato all 2.015-1 [2781 kB] 145s Get:3 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod-common all 0.191-1 [14.6 kB] 145s Get:4 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 145s Get:5 http://ftpmaster.internal/ubuntu oracular/universe s390x fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 145s Get:6 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 146s Get:7 http://ftpmaster.internal/ubuntu oracular/main s390x libbabeltrace1 s390x 1.5.11-3build3 [173 kB] 146s Get:8 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod1t64 s390x 0.191-1 [17.6 kB] 146s Get:9 http://ftpmaster.internal/ubuntu oracular/main s390x libpython3.12t64 s390x 3.12.4-1 [2507 kB] 146s Get:10 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 146s Get:11 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight4t64 s390x 3.1.9-4.3build1 [268 kB] 146s Get:12 http://ftpmaster.internal/ubuntu oracular/main s390x gdb s390x 15.0.50.20240403-0ubuntu1 [3899 kB] 147s Get:13 http://ftpmaster.internal/ubuntu oracular/main s390x python3-platformdirs all 4.2.1-1 [16.3 kB] 147s Get:14 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x python3-traitlets all 5.14.3-1 [71.3 kB] 147s Get:15 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-core all 5.3.2-2 [25.5 kB] 147s Get:16 http://ftpmaster.internal/ubuntu oracular/universe s390x jupyter-core all 5.3.2-2 [4038 B] 147s Get:17 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 147s Get:18 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 147s Get:19 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 147s Get:20 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 147s Get:21 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 147s Get:22 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 147s Get:23 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-es6-promise all 4.2.8-12 [14.1 kB] 147s Get:24 http://ftpmaster.internal/ubuntu oracular/universe s390x node-jed all 1.1.1-4 [15.2 kB] 147s Get:25 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jed all 1.1.1-4 [2584 B] 147s Get:26 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 147s Get:27 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 147s Get:28 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 147s Get:29 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 148s Get:30 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-moment all 2.29.4+ds-1 [147 kB] 148s Get:31 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 kB] 148s Get:32 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs-text all 
2.0.12-1.1 [9056 B] 148s Get:33 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-text-encoding all 0.7.0-5 [140 kB] 149s Get:34 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-xterm all 5.3.0-2 [476 kB] 149s Get:35 http://ftpmaster.internal/ubuntu oracular/main s390x python3-ptyprocess all 0.7.0-5 [15.1 kB] 149s Get:36 http://ftpmaster.internal/ubuntu oracular/main s390x python3-tornado s390x 6.4.1-1 [298 kB] 149s Get:37 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-terminado all 0.18.1-1 [13.2 kB] 149s Get:38 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-argon2 s390x 21.1.0-2build1 [21.2 kB] 149s Get:39 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-comm all 0.2.1-1 [7016 B] 149s Get:40 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bytecode all 0.15.1-3 [44.7 kB] 149s Get:41 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-coverage s390x 7.4.4+dfsg1-0ubuntu2 [147 kB] 149s Get:42 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pydevd s390x 2.10.0+ds-10ubuntu1 [638 kB] 149s Get:43 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 149s Get:44 http://ftpmaster.internal/ubuntu oracular/main s390x python3-decorator all 5.1.1-5 [10.1 kB] 149s Get:45 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-parso all 0.8.3-1 [67.2 kB] 149s Get:46 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 149s Get:47 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jedi all 0.19.1+ds1-1 [693 kB] 149s Get:48 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-matplotlib-inline all 0.1.6-2 [8784 B] 149s Get:49 http://ftpmaster.internal/ubuntu oracular/main s390x python3-pexpect all 4.9-2 [48.1 kB] 149s Get:50 http://ftpmaster.internal/ubuntu oracular/main s390x python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 149s Get:51 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-prompt-toolkit all 3.0.46-1 [256 kB] 149s Get:52 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-asttokens all 2.4.1-1 [20.9 kB] 149s Get:53 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-executing all 2.0.1-0.1 [23.3 kB] 149s Get:54 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pure-eval all 0.2.2-2 [11.1 kB] 149s Get:55 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-stack-data all 0.6.3-1 [22.0 kB] 149s Get:56 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython all 8.20.0-1ubuntu1 [561 kB] 149s Get:57 http://ftpmaster.internal/ubuntu oracular/main s390x python3-dateutil all 2.9.0-2 [80.3 kB] 149s Get:58 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-entrypoints all 0.4-2 [7146 B] 149s Get:59 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nest-asyncio all 1.5.4-1 [6256 B] 149s Get:60 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-py all 1.11.0-2 [72.7 kB] 149s Get:61 http://ftpmaster.internal/ubuntu oracular/universe s390x libnorm1t64 s390x 1.5.9+dfsg-3.1build1 [158 kB] 149s Get:62 http://ftpmaster.internal/ubuntu oracular/universe s390x libpgm-5.3-0t64 s390x 5.3.128~dfsg-2.1build1 [169 kB] 149s Get:63 http://ftpmaster.internal/ubuntu oracular/main s390x libsodium23 s390x 1.0.18-1build3 [138 kB] 149s Get:64 http://ftpmaster.internal/ubuntu oracular/universe s390x libzmq5 s390x 4.3.5-1build2 
[258 kB] 149s Get:65 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-zmq s390x 24.0.1-5build1 [298 kB] 150s Get:66 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 150s Get:67 http://ftpmaster.internal/ubuntu oracular/main s390x python3-packaging all 24.0-1 [41.1 kB] 150s Get:68 http://ftpmaster.internal/ubuntu oracular/main s390x python3-psutil s390x 5.9.8-2build2 [195 kB] 150s Get:69 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 150s Get:70 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython-genutils all 0.2.0-6 [22.0 kB] 150s Get:71 http://ftpmaster.internal/ubuntu oracular/main s390x python3-webencodings all 0.5.1-5 [11.5 kB] 150s Get:72 http://ftpmaster.internal/ubuntu oracular/main s390x python3-html5lib all 1.1-6 [88.8 kB] 150s Get:73 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bleach all 6.1.0-2 [49.6 kB] 150s Get:74 http://ftpmaster.internal/ubuntu oracular/main s390x python3-soupsieve all 2.5-1 [33.0 kB] 150s Get:75 http://ftpmaster.internal/ubuntu oracular/main s390x python3-bs4 all 4.12.3-1 [109 kB] 150s Get:76 http://ftpmaster.internal/ubuntu oracular/main s390x python3-defusedxml all 0.7.1-2 [42.0 kB] 150s Get:77 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 150s Get:78 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-mistune all 3.0.2-1 [32.8 kB] 150s Get:79 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-fastjsonschema all 2.19.1-1 [19.7 kB] 150s Get:80 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbformat all 5.9.1-1 [41.2 kB] 150s Get:81 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbclient all 0.8.0-1 [55.6 kB] 150s Get:82 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pandocfilters all 1.5.1-1 [23.6 kB] 150s Get:83 http://ftpmaster.internal/ubuntu oracular/universe s390x python-tinycss2-common all 1.3.0-1 [34.1 kB] 150s Get:84 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-tinycss2 all 1.3.0-1 [19.6 kB] 150s Get:85 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbconvert all 7.16.4-1 [156 kB] 150s Get:86 http://ftpmaster.internal/ubuntu oracular/main s390x python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 150s Get:87 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-send2trash all 1.8.2-1 [15.5 kB] 150s Get:88 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 150s Get:89 http://ftpmaster.internal/ubuntu oracular/universe s390x jupyter-notebook all 6.4.12-2.2ubuntu1 [10.4 kB] 150s Get:90 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-sphinxdoc all 7.2.6-8 [150 kB] 150s Get:91 http://ftpmaster.internal/ubuntu oracular/main s390x sphinx-rtd-theme-common all 2.0.0+dfsg-1 [1012 kB] 150s Get:92 http://ftpmaster.internal/ubuntu oracular/universe s390x python-notebook-doc all 6.4.12-2.2ubuntu1 [2540 kB] 151s Get:93 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-iniconfig all 1.1.1-2 [6024 B] 151s Get:94 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pluggy all 1.5.0-1 [21.0 kB] 151s Get:95 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pytest all 7.4.4-1 [305 kB] 151s Get:96 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-requests-unixsocket all 0.3.0-4 [7274 B] 
151s Preconfiguring packages ... 151s Fetched 33.5 MB in 8s (4173 kB/s) 151s Selecting previously unselected package fonts-lato. 151s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54671 files and directories currently installed.) 151s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 151s Unpacking fonts-lato (2.015-1) ... 152s Selecting previously unselected package libdebuginfod-common. 152s Preparing to unpack .../01-libdebuginfod-common_0.191-1_all.deb ... 152s Unpacking libdebuginfod-common (0.191-1) ... 152s Selecting previously unselected package fonts-font-awesome. 152s Preparing to unpack .../02-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 152s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 152s Selecting previously unselected package fonts-glyphicons-halflings. 152s Preparing to unpack .../03-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 152s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 152s Selecting previously unselected package fonts-mathjax. 152s Preparing to unpack .../04-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 152s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 152s Selecting previously unselected package libbabeltrace1:s390x. 152s Preparing to unpack .../05-libbabeltrace1_1.5.11-3build3_s390x.deb ... 152s Unpacking libbabeltrace1:s390x (1.5.11-3build3) ... 152s Selecting previously unselected package libdebuginfod1t64:s390x. 152s Preparing to unpack .../06-libdebuginfod1t64_0.191-1_s390x.deb ... 152s Unpacking libdebuginfod1t64:s390x (0.191-1) ... 152s Selecting previously unselected package libpython3.12t64:s390x. 152s Preparing to unpack .../07-libpython3.12t64_3.12.4-1_s390x.deb ... 152s Unpacking libpython3.12t64:s390x (3.12.4-1) ... 152s Selecting previously unselected package libsource-highlight-common. 152s Preparing to unpack .../08-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 152s Unpacking libsource-highlight-common (3.1.9-4.3build1) ... 152s Selecting previously unselected package libsource-highlight4t64:s390x. 152s Preparing to unpack .../09-libsource-highlight4t64_3.1.9-4.3build1_s390x.deb ... 152s Unpacking libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 152s Selecting previously unselected package gdb. 152s Preparing to unpack .../10-gdb_15.0.50.20240403-0ubuntu1_s390x.deb ... 152s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 152s Selecting previously unselected package python3-platformdirs. 152s Preparing to unpack .../11-python3-platformdirs_4.2.1-1_all.deb ... 152s Unpacking python3-platformdirs (4.2.1-1) ... 152s Selecting previously unselected package python3-traitlets. 152s Preparing to unpack .../12-python3-traitlets_5.14.3-1_all.deb ... 152s Unpacking python3-traitlets (5.14.3-1) ... 152s Selecting previously unselected package python3-jupyter-core. 152s Preparing to unpack .../13-python3-jupyter-core_5.3.2-2_all.deb ... 152s Unpacking python3-jupyter-core (5.3.2-2) ... 152s Selecting previously unselected package jupyter-core. 
152s Preparing to unpack .../14-jupyter-core_5.3.2-2_all.deb ... 152s Unpacking jupyter-core (5.3.2-2) ... 152s Selecting previously unselected package libjs-underscore. 152s Preparing to unpack .../15-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 152s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 152s Selecting previously unselected package libjs-backbone. 152s Preparing to unpack .../16-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 152s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 152s Selecting previously unselected package libjs-bootstrap. 152s Preparing to unpack .../17-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 152s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 152s Selecting previously unselected package libjs-jquery. 152s Preparing to unpack .../18-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 152s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 152s Selecting previously unselected package libjs-bootstrap-tour. 152s Preparing to unpack .../19-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 152s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 152s Selecting previously unselected package libjs-codemirror. 152s Preparing to unpack .../20-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ... 152s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ... 153s Selecting previously unselected package libjs-es6-promise. 153s Preparing to unpack .../21-libjs-es6-promise_4.2.8-12_all.deb ... 153s Unpacking libjs-es6-promise (4.2.8-12) ... 153s Selecting previously unselected package node-jed. 153s Preparing to unpack .../22-node-jed_1.1.1-4_all.deb ... 153s Unpacking node-jed (1.1.1-4) ... 153s Selecting previously unselected package libjs-jed. 153s Preparing to unpack .../23-libjs-jed_1.1.1-4_all.deb ... 153s Unpacking libjs-jed (1.1.1-4) ... 153s Selecting previously unselected package libjs-jquery-typeahead. 153s Preparing to unpack .../24-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 153s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 153s Selecting previously unselected package libjs-jquery-ui. 153s Preparing to unpack .../25-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 153s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 153s Selecting previously unselected package libjs-marked. 153s Preparing to unpack .../26-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ... 153s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ... 153s Selecting previously unselected package libjs-mathjax. 153s Preparing to unpack .../27-libjs-mathjax_2.7.9+dfsg-1_all.deb ... 153s Unpacking libjs-mathjax (2.7.9+dfsg-1) ... 154s Selecting previously unselected package libjs-moment. 154s Preparing to unpack .../28-libjs-moment_2.29.4+ds-1_all.deb ... 154s Unpacking libjs-moment (2.29.4+ds-1) ... 154s Selecting previously unselected package libjs-requirejs. 154s Preparing to unpack .../29-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ... 154s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 154s Selecting previously unselected package libjs-requirejs-text. 154s Preparing to unpack .../30-libjs-requirejs-text_2.0.12-1.1_all.deb ... 154s Unpacking libjs-requirejs-text (2.0.12-1.1) ... 154s Selecting previously unselected package libjs-text-encoding. 154s Preparing to unpack .../31-libjs-text-encoding_0.7.0-5_all.deb ... 154s Unpacking libjs-text-encoding (0.7.0-5) ... 154s Selecting previously unselected package libjs-xterm. 154s Preparing to unpack .../32-libjs-xterm_5.3.0-2_all.deb ... 154s Unpacking libjs-xterm (5.3.0-2) ... 154s Selecting previously unselected package python3-ptyprocess. 
154s Preparing to unpack .../33-python3-ptyprocess_0.7.0-5_all.deb ... 154s Unpacking python3-ptyprocess (0.7.0-5) ... 154s Selecting previously unselected package python3-tornado. 154s Preparing to unpack .../34-python3-tornado_6.4.1-1_s390x.deb ... 154s Unpacking python3-tornado (6.4.1-1) ... 154s Selecting previously unselected package python3-terminado. 154s Preparing to unpack .../35-python3-terminado_0.18.1-1_all.deb ... 154s Unpacking python3-terminado (0.18.1-1) ... 154s Selecting previously unselected package python3-argon2. 154s Preparing to unpack .../36-python3-argon2_21.1.0-2build1_s390x.deb ... 154s Unpacking python3-argon2 (21.1.0-2build1) ... 154s Selecting previously unselected package python3-comm. 154s Preparing to unpack .../37-python3-comm_0.2.1-1_all.deb ... 154s Unpacking python3-comm (0.2.1-1) ... 154s Selecting previously unselected package python3-bytecode. 154s Preparing to unpack .../38-python3-bytecode_0.15.1-3_all.deb ... 154s Unpacking python3-bytecode (0.15.1-3) ... 154s Selecting previously unselected package python3-coverage. 154s Preparing to unpack .../39-python3-coverage_7.4.4+dfsg1-0ubuntu2_s390x.deb ... 154s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 154s Selecting previously unselected package python3-pydevd. 154s Preparing to unpack .../40-python3-pydevd_2.10.0+ds-10ubuntu1_s390x.deb ... 154s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ... 154s Selecting previously unselected package python3-debugpy. 154s Preparing to unpack .../41-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ... 154s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ... 154s Selecting previously unselected package python3-decorator. 154s Preparing to unpack .../42-python3-decorator_5.1.1-5_all.deb ... 154s Unpacking python3-decorator (5.1.1-5) ... 154s Selecting previously unselected package python3-parso. 154s Preparing to unpack .../43-python3-parso_0.8.3-1_all.deb ... 154s Unpacking python3-parso (0.8.3-1) ... 154s Selecting previously unselected package python3-typeshed. 154s Preparing to unpack .../44-python3-typeshed_0.0~git20231111.6764465-3_all.deb ... 154s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ... 155s Selecting previously unselected package python3-jedi. 155s Preparing to unpack .../45-python3-jedi_0.19.1+ds1-1_all.deb ... 155s Unpacking python3-jedi (0.19.1+ds1-1) ... 155s Selecting previously unselected package python3-matplotlib-inline. 155s Preparing to unpack .../46-python3-matplotlib-inline_0.1.6-2_all.deb ... 155s Unpacking python3-matplotlib-inline (0.1.6-2) ... 155s Selecting previously unselected package python3-pexpect. 155s Preparing to unpack .../47-python3-pexpect_4.9-2_all.deb ... 155s Unpacking python3-pexpect (4.9-2) ... 155s Selecting previously unselected package python3-wcwidth. 155s Preparing to unpack .../48-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ... 155s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 155s Selecting previously unselected package python3-prompt-toolkit. 155s Preparing to unpack .../49-python3-prompt-toolkit_3.0.46-1_all.deb ... 155s Unpacking python3-prompt-toolkit (3.0.46-1) ... 155s Selecting previously unselected package python3-asttokens. 155s Preparing to unpack .../50-python3-asttokens_2.4.1-1_all.deb ... 155s Unpacking python3-asttokens (2.4.1-1) ... 155s Selecting previously unselected package python3-executing. 155s Preparing to unpack .../51-python3-executing_2.0.1-0.1_all.deb ... 155s Unpacking python3-executing (2.0.1-0.1) ... 
155s Selecting previously unselected package python3-pure-eval. 155s Preparing to unpack .../52-python3-pure-eval_0.2.2-2_all.deb ... 155s Unpacking python3-pure-eval (0.2.2-2) ... 155s Selecting previously unselected package python3-stack-data. 155s Preparing to unpack .../53-python3-stack-data_0.6.3-1_all.deb ... 155s Unpacking python3-stack-data (0.6.3-1) ... 155s Selecting previously unselected package python3-ipython. 155s Preparing to unpack .../54-python3-ipython_8.20.0-1ubuntu1_all.deb ... 155s Unpacking python3-ipython (8.20.0-1ubuntu1) ... 155s Selecting previously unselected package python3-dateutil. 155s Preparing to unpack .../55-python3-dateutil_2.9.0-2_all.deb ... 155s Unpacking python3-dateutil (2.9.0-2) ... 155s Selecting previously unselected package python3-entrypoints. 155s Preparing to unpack .../56-python3-entrypoints_0.4-2_all.deb ... 155s Unpacking python3-entrypoints (0.4-2) ... 155s Selecting previously unselected package python3-nest-asyncio. 155s Preparing to unpack .../57-python3-nest-asyncio_1.5.4-1_all.deb ... 155s Unpacking python3-nest-asyncio (1.5.4-1) ... 155s Selecting previously unselected package python3-py. 155s Preparing to unpack .../58-python3-py_1.11.0-2_all.deb ... 155s Unpacking python3-py (1.11.0-2) ... 155s Selecting previously unselected package libnorm1t64:s390x. 155s Preparing to unpack .../59-libnorm1t64_1.5.9+dfsg-3.1build1_s390x.deb ... 155s Unpacking libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 155s Selecting previously unselected package libpgm-5.3-0t64:s390x. 155s Preparing to unpack .../60-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_s390x.deb ... 155s Unpacking libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 155s Selecting previously unselected package libsodium23:s390x. 155s Preparing to unpack .../61-libsodium23_1.0.18-1build3_s390x.deb ... 155s Unpacking libsodium23:s390x (1.0.18-1build3) ... 155s Selecting previously unselected package libzmq5:s390x. 155s Preparing to unpack .../62-libzmq5_4.3.5-1build2_s390x.deb ... 155s Unpacking libzmq5:s390x (4.3.5-1build2) ... 155s Selecting previously unselected package python3-zmq. 155s Preparing to unpack .../63-python3-zmq_24.0.1-5build1_s390x.deb ... 155s Unpacking python3-zmq (24.0.1-5build1) ... 156s Selecting previously unselected package python3-jupyter-client. 156s Preparing to unpack .../64-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ... 156s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ... 156s Selecting previously unselected package python3-packaging. 156s Preparing to unpack .../65-python3-packaging_24.0-1_all.deb ... 156s Unpacking python3-packaging (24.0-1) ... 156s Selecting previously unselected package python3-psutil. 156s Preparing to unpack .../66-python3-psutil_5.9.8-2build2_s390x.deb ... 156s Unpacking python3-psutil (5.9.8-2build2) ... 156s Selecting previously unselected package python3-ipykernel. 156s Preparing to unpack .../67-python3-ipykernel_6.29.3-1ubuntu1_all.deb ... 156s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ... 156s Selecting previously unselected package python3-ipython-genutils. 156s Preparing to unpack .../68-python3-ipython-genutils_0.2.0-6_all.deb ... 156s Unpacking python3-ipython-genutils (0.2.0-6) ... 156s Selecting previously unselected package python3-webencodings. 156s Preparing to unpack .../69-python3-webencodings_0.5.1-5_all.deb ... 156s Unpacking python3-webencodings (0.5.1-5) ... 156s Selecting previously unselected package python3-html5lib. 156s Preparing to unpack .../70-python3-html5lib_1.1-6_all.deb ... 
156s Unpacking python3-html5lib (1.1-6) ... 156s Selecting previously unselected package python3-bleach. 156s Preparing to unpack .../71-python3-bleach_6.1.0-2_all.deb ... 156s Unpacking python3-bleach (6.1.0-2) ... 156s Selecting previously unselected package python3-soupsieve. 156s Preparing to unpack .../72-python3-soupsieve_2.5-1_all.deb ... 156s Unpacking python3-soupsieve (2.5-1) ... 156s Selecting previously unselected package python3-bs4. 156s Preparing to unpack .../73-python3-bs4_4.12.3-1_all.deb ... 156s Unpacking python3-bs4 (4.12.3-1) ... 156s Selecting previously unselected package python3-defusedxml. 156s Preparing to unpack .../74-python3-defusedxml_0.7.1-2_all.deb ... 156s Unpacking python3-defusedxml (0.7.1-2) ... 156s Selecting previously unselected package python3-jupyterlab-pygments. 156s Preparing to unpack .../75-python3-jupyterlab-pygments_0.2.2-3_all.deb ... 156s Unpacking python3-jupyterlab-pygments (0.2.2-3) ... 156s Selecting previously unselected package python3-mistune. 156s Preparing to unpack .../76-python3-mistune_3.0.2-1_all.deb ... 156s Unpacking python3-mistune (3.0.2-1) ... 156s Selecting previously unselected package python3-fastjsonschema. 156s Preparing to unpack .../77-python3-fastjsonschema_2.19.1-1_all.deb ... 156s Unpacking python3-fastjsonschema (2.19.1-1) ... 156s Selecting previously unselected package python3-nbformat. 156s Preparing to unpack .../78-python3-nbformat_5.9.1-1_all.deb ... 156s Unpacking python3-nbformat (5.9.1-1) ... 156s Selecting previously unselected package python3-nbclient. 156s Preparing to unpack .../79-python3-nbclient_0.8.0-1_all.deb ... 156s Unpacking python3-nbclient (0.8.0-1) ... 156s Selecting previously unselected package python3-pandocfilters. 156s Preparing to unpack .../80-python3-pandocfilters_1.5.1-1_all.deb ... 156s Unpacking python3-pandocfilters (1.5.1-1) ... 156s Selecting previously unselected package python-tinycss2-common. 156s Preparing to unpack .../81-python-tinycss2-common_1.3.0-1_all.deb ... 156s Unpacking python-tinycss2-common (1.3.0-1) ... 156s Selecting previously unselected package python3-tinycss2. 156s Preparing to unpack .../82-python3-tinycss2_1.3.0-1_all.deb ... 156s Unpacking python3-tinycss2 (1.3.0-1) ... 156s Selecting previously unselected package python3-nbconvert. 156s Preparing to unpack .../83-python3-nbconvert_7.16.4-1_all.deb ... 156s Unpacking python3-nbconvert (7.16.4-1) ... 156s Selecting previously unselected package python3-prometheus-client. 156s Preparing to unpack .../84-python3-prometheus-client_0.19.0+ds1-1_all.deb ... 156s Unpacking python3-prometheus-client (0.19.0+ds1-1) ... 156s Selecting previously unselected package python3-send2trash. 156s Preparing to unpack .../85-python3-send2trash_1.8.2-1_all.deb ... 156s Unpacking python3-send2trash (1.8.2-1) ... 156s Selecting previously unselected package python3-notebook. 156s Preparing to unpack .../86-python3-notebook_6.4.12-2.2ubuntu1_all.deb ... 156s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ... 156s Selecting previously unselected package jupyter-notebook. 156s Preparing to unpack .../87-jupyter-notebook_6.4.12-2.2ubuntu1_all.deb ... 156s Unpacking jupyter-notebook (6.4.12-2.2ubuntu1) ... 156s Selecting previously unselected package libjs-sphinxdoc. 156s Preparing to unpack .../88-libjs-sphinxdoc_7.2.6-8_all.deb ... 156s Unpacking libjs-sphinxdoc (7.2.6-8) ... 156s Selecting previously unselected package sphinx-rtd-theme-common. 156s Preparing to unpack .../89-sphinx-rtd-theme-common_2.0.0+dfsg-1_all.deb ... 
156s Unpacking sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 156s Selecting previously unselected package python-notebook-doc. 156s Preparing to unpack .../90-python-notebook-doc_6.4.12-2.2ubuntu1_all.deb ... 156s Unpacking python-notebook-doc (6.4.12-2.2ubuntu1) ... 156s Selecting previously unselected package python3-iniconfig. 156s Preparing to unpack .../91-python3-iniconfig_1.1.1-2_all.deb ... 156s Unpacking python3-iniconfig (1.1.1-2) ... 156s Selecting previously unselected package python3-pluggy. 156s Preparing to unpack .../92-python3-pluggy_1.5.0-1_all.deb ... 156s Unpacking python3-pluggy (1.5.0-1) ... 156s Selecting previously unselected package python3-pytest. 156s Preparing to unpack .../93-python3-pytest_7.4.4-1_all.deb ... 156s Unpacking python3-pytest (7.4.4-1) ... 156s Selecting previously unselected package python3-requests-unixsocket. 156s Preparing to unpack .../94-python3-requests-unixsocket_0.3.0-4_all.deb ... 156s Unpacking python3-requests-unixsocket (0.3.0-4) ... 156s Selecting previously unselected package autopkgtest-satdep. 156s Preparing to unpack .../95-1-autopkgtest-satdep.deb ... 156s Unpacking autopkgtest-satdep (0) ... 156s Setting up python3-entrypoints (0.4-2) ... 157s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 157s Setting up python3-iniconfig (1.1.1-2) ... 157s Setting up python3-tornado (6.4.1-1) ... 157s Setting up libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 157s Setting up python3-pure-eval (0.2.2-2) ... 157s Setting up python3-send2trash (1.8.2-1) ... 157s Setting up fonts-lato (2.015-1) ... 157s Setting up fonts-mathjax (2.7.9+dfsg-1) ... 157s Setting up libsodium23:s390x (1.0.18-1build3) ... 157s Setting up libjs-mathjax (2.7.9+dfsg-1) ... 157s Setting up python3-py (1.11.0-2) ... 158s Setting up libdebuginfod-common (0.191-1) ... 158s Setting up libjs-requirejs-text (2.0.12-1.1) ... 158s Setting up python3-parso (0.8.3-1) ... 158s Setting up python3-defusedxml (0.7.1-2) ... 158s Setting up python3-ipython-genutils (0.2.0-6) ... 158s Setting up python3-asttokens (2.4.1-1) ... 158s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 158s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 159s Setting up libjs-moment (2.29.4+ds-1) ... 159s Setting up python3-pandocfilters (1.5.1-1) ... 159s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 159s Setting up libjs-es6-promise (4.2.8-12) ... 159s Setting up libjs-text-encoding (0.7.0-5) ... 159s Setting up python3-webencodings (0.5.1-5) ... 159s Setting up python3-platformdirs (4.2.1-1) ... 159s Setting up python3-psutil (5.9.8-2build2) ... 159s Setting up libsource-highlight-common (3.1.9-4.3build1) ... 159s Setting up python3-requests-unixsocket (0.3.0-4) ... 160s Setting up python3-jupyterlab-pygments (0.2.2-3) ... 160s Setting up libpython3.12t64:s390x (3.12.4-1) ... 160s Setting up libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 160s Setting up python3-decorator (5.1.1-5) ... 160s Setting up python3-packaging (24.0-1) ... 160s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 160s Setting up node-jed (1.1.1-4) ... 160s Setting up python3-typeshed (0.0~git20231111.6764465-3) ... 160s Setting up python3-executing (2.0.1-0.1) ... 160s Setting up libjs-xterm (5.3.0-2) ... 160s Setting up python3-nest-asyncio (1.5.4-1) ... 160s Setting up python3-bytecode (0.15.1-3) ... 161s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ... 161s Setting up libjs-jed (1.1.1-4) ... 161s Setting up python3-html5lib (1.1-6) ... 
161s Setting up libbabeltrace1:s390x (1.5.11-3build3) ... 161s Setting up python3-pluggy (1.5.0-1) ... 161s Setting up python3-fastjsonschema (2.19.1-1) ... 161s Setting up python3-traitlets (5.14.3-1) ... 161s Setting up python-tinycss2-common (1.3.0-1) ... 161s Setting up python3-argon2 (21.1.0-2build1) ... 162s Setting up python3-dateutil (2.9.0-2) ... 162s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 162s Setting up python3-mistune (3.0.2-1) ... 162s Setting up python3-stack-data (0.6.3-1) ... 162s Setting up python3-soupsieve (2.5-1) ... 163s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 163s Setting up sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 163s Setting up python3-jupyter-core (5.3.2-2) ... 163s Setting up libjs-bootstrap (3.4.1+dfsg-3) ... 163s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 163s Setting up python3-ptyprocess (0.7.0-5) ... 163s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ... 163s Setting up python3-prompt-toolkit (3.0.46-1) ... 164s Setting up libdebuginfod1t64:s390x (0.191-1) ... 164s Setting up python3-tinycss2 (1.3.0-1) ... 164s Setting up libzmq5:s390x (4.3.5-1build2) ... 164s Setting up python3-jedi (0.19.1+ds1-1) ... 164s Setting up python3-pytest (7.4.4-1) ... 164s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ... 164s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 164s Setting up libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 164s Setting up python3-nbformat (5.9.1-1) ... 165s Setting up python3-bs4 (4.12.3-1) ... 165s Setting up python3-bleach (6.1.0-2) ... 165s Setting up python3-matplotlib-inline (0.1.6-2) ... 165s Setting up python3-comm (0.2.1-1) ... 165s Setting up python3-prometheus-client (0.19.0+ds1-1) ... 166s Setting up gdb (15.0.50.20240403-0ubuntu1) ... 166s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ... 166s Setting up python3-pexpect (4.9-2) ... 166s Setting up python3-zmq (24.0.1-5build1) ... 166s Setting up libjs-sphinxdoc (7.2.6-8) ... 166s Setting up python3-terminado (0.18.1-1) ... 166s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ... 166s Setting up jupyter-core (5.3.2-2) ... 166s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ... 167s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ... 167s Setting up python-notebook-doc (6.4.12-2.2ubuntu1) ... 167s Setting up python3-nbclient (0.8.0-1) ... 168s Setting up python3-ipython (8.20.0-1ubuntu1) ... 168s Setting up python3-ipykernel (6.29.3-1ubuntu1) ... 168s Setting up python3-nbconvert (7.16.4-1) ... 169s Setting up python3-notebook (6.4.12-2.2ubuntu1) ... 169s Setting up jupyter-notebook (6.4.12-2.2ubuntu1) ... 169s Setting up autopkgtest-satdep (0) ... 169s Processing triggers for man-db (2.12.1-2) ... 170s Processing triggers for libc-bin (2.39-0ubuntu9) ... 174s (Reading database ... 71286 files and directories currently installed.) 174s Removing autopkgtest-satdep (0) ... 175s autopkgtest [10:27:03]: test pytest: [----------------------- 177s ============================= test session starts ============================== 177s platform linux -- Python 3.12.4, pytest-7.4.4, pluggy-1.5.0 177s rootdir: /tmp/autopkgtest.s4beMp/build.uMz/src 177s collected 330 items / 5 deselected / 325 selected 177s 178s notebook/auth/tests/test_login.py EE [ 0%] 179s notebook/auth/tests/test_security.py .... [ 1%] 180s notebook/bundler/tests/test_bundler_api.py EEEEE [ 3%] 180s notebook/bundler/tests/test_bundler_tools.py ............. [ 7%] 180s notebook/bundler/tests/test_bundlerextension.py ... 
[ 8%]
180s notebook/nbconvert/tests/test_nbconvert_handlers.py ssssss [ 10%]
181s notebook/services/api/tests/test_api.py EEE [ 11%]
181s notebook/services/config/tests/test_config_api.py EEE [ 12%]
183s notebook/services/contents/tests/test_contents_api.py EsEEEEEEEEEEssEEsE [ 17%]
192s EEEEEEEEEEEEEEEEEEEEEEEEEsEEEEEEEEEEEssEEsEEEEEEEEEEEEEEEEEEEEEEEEE [ 38%]
192s notebook/services/contents/tests/test_fileio.py ... [ 39%]
192s notebook/services/contents/tests/test_largefilemanager.py . [ 39%]
193s notebook/services/contents/tests/test_manager.py .....s........ss....... [ 46%]
193s ...ss........ [ 50%]
195s notebook/services/kernels/tests/test_kernels_api.py EEEEEEEEEEEE [ 54%]
196s notebook/services/kernelspecs/tests/test_kernelspecs_api.py EEEEEEE [ 56%]
196s notebook/services/nbconvert/tests/test_nbconvert_api.py E [ 56%]
198s notebook/services/sessions/tests/test_sessionmanager.py FFFFFFFFF [ 59%]
200s notebook/services/sessions/tests/test_sessions_api.py EEEEEEEEEEEEEEEEEE [ 64%]
201s EEEE [ 66%]
202s notebook/terminal/tests/test_terminals_api.py EEEEEEEE [ 68%]
202s notebook/tests/test_config_manager.py . [ 68%]
203s notebook/tests/test_files.py EEEEE [ 70%]
204s notebook/tests/test_gateway.py EEEEEE [ 72%]
204s notebook/tests/test_i18n.py . [ 72%]
204s notebook/tests/test_log.py . [ 72%]
205s notebook/tests/test_nbextensions.py ................................... [ 83%]
209s notebook/tests/test_notebookapp.py FFFFFFFFF........F.EEEEEEE [ 91%]
209s notebook/tests/test_paths.py ..E [ 92%]
209s notebook/tests/test_serialize.py .. [ 93%]
210s notebook/tests/test_serverextensions.py ...FF [ 94%]
210s notebook/tests/test_traittypes.py ........... [ 98%]
211s notebook/tests/test_utils.py F...s [ 99%]
211s notebook/tree/tests/test_tree_handler.py E [100%]
211s 
211s ==================================== ERRORS ====================================
211s __________________ ERROR at setup of LoginTest.test_next_bad ___________________
211s 
211s self = 
211s 
211s def _new_conn(self) -> socket.socket:
211s """Establish a socket connection and set nodelay settings on it.
211s 
211s :return: New socket connection.
211s """
211s try:
211s > sock = connection.create_connection(
211s (self._dns_host, self.port),
211s self.timeout,
211s source_address=self.source_address,
211s socket_options=self.socket_options,
211s )
211s 
211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 
211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
211s raise err
211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
211s 
211s address = ('localhost', 12341), timeout = None, source_address = None
211s socket_options = [(6, 1, 1)]
211s 
211s def create_connection(
211s address: tuple[str, int],
211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
211s source_address: tuple[str, int] | None = None,
211s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
211s ) -> socket.socket:
211s """Connect to *address* and return the socket object.
211s 
211s Convenience function. Connect to *address* (a 2-tuple ``(host,
211s port)``) and return the socket object. Passing the optional
211s *timeout* parameter will set the timeout on the socket instance
211s before attempting to connect. If no *timeout* is supplied, the
211s global default timeout setting returned by :func:`socket.getdefaulttimeout`
211s is used.
If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 
211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
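To make the release_conn / preload_content note above concrete, here is the streaming pattern it refers to, sketched against a hypothetical URL that is not taken from this log:

import urllib3

http = urllib3.PoolManager()
# With preload_content=False the body is not read eagerly, so the caller must
# drain it and explicitly hand the connection back to the pool.
r = http.request("GET", "http://example.com/", preload_content=False)
payload = r.read()
r.release_conn()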
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 
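The except-block above is where the refused socket finally becomes the requests-level error reported further down. A sketch of the whole chain, assuming (as in this run) that nothing is listening on localhost:12341; the (3.05, 27) tuple also illustrates the (connect, read) timeout form documented earlier:

import requests
from urllib3.exceptions import MaxRetryError, NewConnectionError

try:
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=(3.05, 27))
except requests.exceptions.ConnectionError as exc:
    wrapped = exc.args[0]                                    # the MaxRetryError raised by urllib3
    print(isinstance(wrapped, MaxRetryError))                # True
    print(isinstance(wrapped.reason, NewConnectionError))    # True: [Errno 111] Connection refused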
211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 
'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________________ ERROR at setup of LoginTest.test_next_ok ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
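For reference, the socket_options value [(6, 1, 1)] in the locals above corresponds on Linux to (socket.IPPROTO_TCP, socket.TCP_NODELAY, 1), urllib3's default of disabling Nagle's algorithm. A sketch of calling the helper shown above directly, with the address reused from this log (the connect is expected to be refused in this environment):

import socket
from urllib3.util.connection import create_connection

try:
    sock = create_connection(
        ("localhost", 12341),
        timeout=1.0,
        socket_options=[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)],
    )
    sock.close()
except OSError as exc:   # ConnectionRefusedError, [Errno 111], as in this test run
    print(exc)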
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
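The isolation described in the comments above comes down to mock-patching jupyter_core.paths and os.environ for the lifetime of the test class, which is exactly what the patch.multiple call that follows does. A stripped-down sketch of the same technique; the environment override is a placeholder, and jupyter_core is assumed to be importable, as it is on this testbed:

import os
import tempfile
from unittest.mock import patch

import jupyter_core.paths

tmp = tempfile.mkdtemp()
env_patch = patch.dict("os.environ", {"JUPYTER_NO_CONFIG": "1"})  # placeholder override
path_patch = patch.multiple(
    jupyter_core.paths,
    SYSTEM_JUPYTER_PATH=[os.path.join(tmp, "share", "jupyter")],
    SYSTEM_CONFIG_PATH=[os.path.join(tmp, "etc", "jupyter")],
)
env_patch.start()
path_patch.start()
# ... run code that should only see the isolated directories ...
path_patch.stop()
env_patch.stop()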
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of BundleAPITest.test_bundler_import_error ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
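The setup_class / wait_until_alive machinery shown above reduces to a simple pattern: poll the contents API until it answers, and raise the RuntimeError seen in this log as soon as the server thread has died. A self-contained sketch; the names, timings and URL handling are illustrative rather than the harness's actual constants:

import time

import requests


def wait_until_alive(url, server_thread, max_wait=30.0, poll_interval=0.1):
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        try:
            requests.get(url, timeout=poll_interval)
            return  # the server answered, so it is alive
        except requests.RequestException as exc:
            if not server_thread.is_alive():
                # Mirrors the failure mode in this log: the NotebookApp thread
                # exited before the port was ever bound.
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(poll_interval)
    raise TimeoutError(f"Server at {url} did not come up within {max_wait}s")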
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s """Make a test notebook. Borrowed from nbconvert test. 
Assumes the class 211s teardown will clean it up in the end.""" 211s > super().setup_class() 211s 211s notebook/bundler/tests/test_bundler_api.py:27: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of BundleAPITest.test_bundler_invoke ______________
Assumes the class 211s teardown will clean it up in the end.""" 211s > super().setup_class() 211s 211s notebook/bundler/tests/test_bundler_api.py:27: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of BundleAPITest.test_bundler_not_enabled ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s """Make a test notebook. Borrowed from nbconvert test. 
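# A minimal sketch of the mapping performed in adapter.send() above: urllib3's
# MaxRetryError (caused here by NewConnectionError / ECONNREFUSED) is re-raised as
# requests.exceptions.ConnectionError. Port 12341 is the port the notebook test server
# was expected on; the sketch assumes nothing is listening there.
import requests

try:
    requests.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    print(type(exc.args[0]).__name__)    # MaxRetryError, as in the traceback above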
Assumes the class 211s teardown will clean it up in the end.""" 211s > super().setup_class() 211s 211s notebook/bundler/tests/test_bundler_api.py:27: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of BundleAPITest.test_missing_bundler_arg ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
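# A minimal sketch of the polling pattern in launchnotebook.wait_until_alive shown
# above: poll the contents API until it answers, and bail out early if the server
# thread has already died. MAX_WAITTIME and POLL_INTERVAL mirror the constants
# referenced in the traceback; the values used here are assumptions.
import time
import requests

MAX_WAITTIME = 30      # seconds (assumed)
POLL_INTERVAL = 0.1    # seconds (assumed)

def wait_until_alive(url, server_thread):
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except Exception as exc:
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server did not respond in time")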
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
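# The [Errno 111] Connection refused above comes straight from the socket layer:
# connect() to a port with no listener fails before any HTTP is spoken. A minimal
# sketch, assuming nothing is listening on the test port 12341:
import errno
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    try:
        sock.connect(("localhost", 12341))
    except OSError as exc:
        print(exc.errno == errno.ECONNREFUSED)   # True when the port is closed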
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
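# The docstring above explains the retries parameter; a minimal sketch of an explicit
# Retry policy (the values are illustrative, not taken from this test run):
from urllib3.util.retry import Retry

retry = Retry(
    total=3,                          # overall budget across error types
    connect=2,                        # connection-establishment failures
    status=2,                         # responses matched by status_forcelist
    status_forcelist=[500, 502, 503],
    allowed_methods=["GET"],          # only retry idempotent GETs
    backoff_factor=0.5,               # exponential backoff between attempts
)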
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
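# The request path above, /a%40b/api/contents, is the test suite's 'a@b' base URL
# prefix with '@' percent-encoded as %40. A minimal sketch of that encoding with the
# standard library:
from urllib.parse import quote, unquote

print(quote("/a@b/api/contents"))      # /a%40b/api/contents
print(unquote("/a%40b/api/contents"))  # /a@b/api/contents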
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s """Make a test notebook. Borrowed from nbconvert test. 
Assumes the class 211s teardown will clean it up in the end.""" 211s > super().setup_class() 211s 211s notebook/bundler/tests/test_bundler_api.py:27: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of BundleAPITest.test_notebook_not_found ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
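# The urlopen() locals above show the target parsed into a urllib3 Url tuple. A
# minimal sketch of that parsing step:
from urllib3.util import parse_url

print(parse_url("/a%40b/api/contents"))
# Url(scheme=None, auth=None, host=None, port=None,
#     path='/a%40b/api/contents', query=None, fragment=None)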
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s """Make a test notebook. Borrowed from nbconvert test. 
Assumes the class 211s teardown will clean it up in the end.""" 211s > super().setup_class() 211s 211s notebook/bundler/tests/test_bundler_api.py:27: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________________ ERROR at setup of APITest.test_get_spec ____________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 
211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 
'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________________ ERROR at setup of APITest.test_get_status ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_no_track_activity _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
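# A small sketch (assumed; urllib3 2.x as in this run) of why the
# Retry(total=0, connect=None, read=False) object shown above turns a single refused
# connection into MaxRetryError: increment() drops total to -1, the new Retry object
# is exhausted, and the original NewConnectionError is carried as the .reason.
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
err = NewConnectionError(None, "Failed to establish a new connection")
try:
    retries.increment(method="GET", url="/a%40b/api/contents", error=err)
except MaxRetryError as exc:
    print(exc.reason is err)   # True: the connect failure becomes the reason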
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________ ERROR at setup of APITest.test_create_retrieve_config _____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
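# A simplified, hypothetical sketch of the startup pattern used by setup_class and
# wait_until_alive above: run the server in a daemon thread, signal readiness (or
# failure) through an Event, then poll an HTTP endpoint and give up early once the
# server thread has died -- which is what produces the RuntimeError below. The names
# start_and_wait, run_server, on_ready and health_url are illustrative, not from the
# test suite.
import threading
import time
import requests

def start_and_wait(run_server, health_url, attempts=30, poll_interval=1.0):
    """run_server(on_ready) is a hypothetical callable that serves until shutdown
    and calls on_ready() once it is actually listening."""
    started = threading.Event()

    def runner():
        try:
            run_server(on_ready=started.set)
        finally:
            started.set()   # set even on failure so the caller never hangs

    thread = threading.Thread(target=runner, daemon=True)
    thread.start()
    started.wait()
    for _ in range(attempts):
        try:
            requests.get(health_url, timeout=poll_interval)
            return thread                     # the server answered
        except Exception as exc:
            if not thread.is_alive():         # the server thread already died
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(poll_interval)
    raise RuntimeError("The notebook server did not respond in time")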
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________________ ERROR at setup of APITest.test_get_unknown __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
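The tmp() helper at the start of setup_class guards os.makedirs() with an errno.EEXIST check; on the Python 3.12 used in this run the same behaviour can be written more directly. A sketch only, not a proposed change to the harness:

    import os

    def tmp(base, *parts):
        path = os.path.join(base, *parts)
        # exist_ok=True swallows only the "directory already exists" case,
        # broadly matching the original try/except errno.EEXIST guard
        os.makedirs(path, exist_ok=True)
        return path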
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________________ ERROR at setup of APITest.test_modify _____________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
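When asyncio is already imported, start_thread() gives the server thread its own event loop and applies nest_asyncio before NotebookApp.start() spins up the Tornado IOLoop. Reduced to the bare pattern (nest_asyncio is assumed to be installed, as it is on this testbed; the app callable is a placeholder):

    import asyncio
    import threading

    def run_server(start_app):
        # a non-main thread has no event loop by default, so create one
        asyncio.set_event_loop(asyncio.new_event_loop())
        import nest_asyncio
        nest_asyncio.apply()   # allow re-entrant use of the loop, as the harness comment notes
        start_app()            # in the harness this is NotebookApp.start()

    t = threading.Thread(target=run_server, args=(lambda: None,), daemon=True)
    t.start()
    t.join()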
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
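increment() is where the refused connection turns into MaxRetryError: requests' default HTTPAdapter passes Retry(total=0, read=False), as seen in the frames above, so the very first connection error exhausts the budget, and adapter.send() then re-wraps the MaxRetryError as requests.exceptions.ConnectionError, which is what wait_until_alive() ultimately catches. The same chain is visible outside the harness (URL copied from the traceback; assumes nothing is listening on that port):

    import requests

    try:
        requests.get("http://localhost:12341/a%40b/api/contents")
    except requests.exceptions.ConnectionError as e:
        # e.args[0] is the urllib3 MaxRetryError; its .reason is the NewConnectionError
        print(type(e.args[0]).__name__, "<-", type(e.args[0].reason).__name__)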
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________________ ERROR at setup of APITest.test_checkpoints __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
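Each of these APITest setup errors reduces to the same underlying condition: the NotebookApp thread is no longer alive, nothing ever listened on the chosen port (12341 in these frames), and every poll gets ECONNREFUSED until setup_class gives up. Were one debugging interactively on the testbed, a quick probe of that state could look like the following (port taken from the traceback; the helper itself is illustrative):

    import socket

    def port_is_open(host="localhost", port=12341, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(port_is_open())   # False while the notebook server is down, as throughout this log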
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of APITest.test_checkpoints_separate_root ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
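# A minimal sketch (not part of the captured test output above) of the failure
# chain this traceback reports: with nothing listening on the port, the OS
# refuses the TCP connection, urllib3 wraps it in NewConnectionError and
# MaxRetryError, and requests re-raises it as ConnectionError, which the test
# harness then turns into "The notebook server failed to start".
# localhost:12341 mirrors the port used in this log and is assumed to have no
# listener when the sketch runs.
import requests

try:
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=1)
except requests.exceptions.ConnectionError as exc:
    # This is the exception that wait_until_alive() keeps catching while
    # polling the contents API.
    print("connection refused:", exc)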
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________________ ERROR at setup of APITest.test_copy ______________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
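# A sketch of the poll-until-alive pattern shown in
# notebook/tests/launchnotebook.py in the tracebacks above.  The constants and
# the helper name here are assumptions for illustration; only the overall
# shape (poll the contents API, fail fast if the server thread died) follows
# the captured code.
import time
import requests

MAX_WAITTIME = 30     # assumed timeout in seconds
POLL_INTERVAL = 0.1   # assumed delay between polls in seconds

def wait_until_alive(base_url, server_thread):
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except Exception as exc:
            # If the server thread has already died, give up immediately
            # instead of polling until the timeout expires -- this is the
            # branch that raises "The notebook server failed to start" above.
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server never became responsive")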
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_copy_400_hidden ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________________ ERROR at setup of APITest.test_copy_copy ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
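211s               # (A redirect response decrements the remaining redirect budget and
211s               # records 'too many redirects' as the cause; the failure in this log
211s               # never reaches this branch -- it takes the connection-error branch
211s               # above and, with total=0 exhausted, raises MaxRetryError below.)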
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
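211s         # patch.multiple below swaps jupyter_core.paths.SYSTEM_JUPYTER_PATH and
211s         # SYSTEM_CONFIG_PATH for per-test temporary directories, so the notebook
211s         # server under test cannot pick up configuration from the build host.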
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of APITest.test_copy_dir_400 __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
211s [... remainder of this traceback is identical to the one above: urlopen() -> _make_request -> _new_conn -> NewConnectionError ([Errno 111] Connection refused), Retry.increment -> MaxRetryError, requests HTTPAdapter.send -> requests.exceptions.ConnectionError, then setup_class -> wait_until_alive -> RuntimeError: The notebook server failed to start (notebook/tests/launchnotebook.py:59) ...]
211s ___________________ ERROR at setup of APITest.test_copy_path ___________________
211s 
211s self = 
211s 
211s     def _new_conn(self) -> socket.socket:
211s         """Establish a socket connection and set nodelay settings on it.
211s 
211s         :return: New socket connection.
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
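# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the captured output): the failure pattern
# seen throughout this log can be reproduced at the urllib3 level. With
# Retry(total=0, read=False) -- the configuration shown in these tracebacks --
# the very first connection error exhausts the budget, increment() raises
# MaxRetryError, and requests re-wraps that as ConnectionError. Port 12341 is
# assumed to have no listener, exactly as in the test run above.
import urllib3
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

http = urllib3.PoolManager(retries=Retry(total=0, read=False))
try:
    http.request("GET", "http://localhost:12341/a%40b/api/contents")
except MaxRetryError as exc:
    # exc.reason is the underlying NewConnectionError ([Errno 111] here)
    print("exhausted after first attempt:", exc.reason)
# ---------------------------------------------------------------------------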
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
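# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the captured output) of the isolation
# pattern this test harness applies next: environment variables and the
# jupyter_core.paths constants are temporarily replaced so the server under
# test cannot pick up host-level configuration. Directory names below are
# placeholders, not the suite's actual values.
import os
import tempfile
from unittest.mock import patch

import jupyter_core.paths

scratch = tempfile.mkdtemp()
env_patch = patch.dict("os.environ", {"HOME": os.path.join(scratch, "home")})
path_patch = patch.multiple(
    jupyter_core.paths,
    SYSTEM_JUPYTER_PATH=[os.path.join(scratch, "share", "jupyter")],
    SYSTEM_CONFIG_PATH=[os.path.join(scratch, "etc", "jupyter")],
)
env_patch.start()
path_patch.start()
try:
    pass  # exercise code that resolves Jupyter data/config paths here
finally:
    path_patch.stop()
    env_patch.stop()
# ---------------------------------------------------------------------------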
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of APITest.test_copy_put_400 __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of APITest.test_copy_put_400_hidden ______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
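send() above maps the exhausted retry budget onto requests.exceptions.ConnectionError. When polling a server that may still be starting, a Session can be mounted with a more generous urllib3 Retry instead of the default single attempt; a sketch using the public requests/urllib3 API and the URL from this log:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# retry failed connection attempts a few times with a short backoff
session.mount("http://", HTTPAdapter(max_retries=Retry(total=5, connect=5, backoff_factor=0.2)))

try:
    resp = session.get("http://localhost:12341/a%40b/api/contents", timeout=5)
    print(resp.status_code)
except requests.exceptions.ConnectionError as exc:
    print("still refused after retries:", exc)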
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_create_untitled ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
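setup_class() above runs NotebookApp in a daemon thread, signals readiness through a threading.Event set in a finally: block (so a crash during startup cannot hang the waiter), and later checks notebook_thread.is_alive() to turn connection failures into the "failed to start" RuntimeError. A stripped-down sketch of that startup pattern with a dummy server function; the names here are illustrative, not from the test suite:

import threading
import time

started = threading.Event()

def run_server() -> None:
    try:
        time.sleep(0.1)          # stand-in for app.initialize()/app.start()
        started.set()            # signal "ready" once the server loop would be running
        while True:
            time.sleep(1)        # pretend to serve requests
    finally:
        started.set()            # always set, so started.wait() cannot hang on a startup crash

server_thread = threading.Thread(target=run_server, daemon=True)
server_thread.start()
started.wait()
print("server thread alive:", server_thread.is_alive())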
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of APITest.test_create_untitled_txt ______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
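wait_until_alive() above polls the contents API until the server answers or the server thread dies. A self-contained version of that polling idea; MAX_WAITTIME and POLL_INTERVAL are assumed values here, the real constants live in notebook/tests/launchnotebook.py:

import time
import requests

MAX_WAITTIME = 30      # assumed: seconds to keep polling
POLL_INTERVAL = 0.1    # assumed: pause between attempts

def wait_until_alive(url: str) -> None:
    # keep trying until the server answers, or give up after MAX_WAITTIME seconds
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except requests.exceptions.ConnectionError:
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"server never became reachable at {url}")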
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
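The adapter code above is where requests turns its timeout argument into a urllib3 Timeout and where an exhausted MaxRetryError comes back out as requests.exceptions.ConnectionError. A short sketch of the same failure seen from the requests side, reusing the host, port and path from this log (nothing listens on 12341 once the server fails to start):

    import requests

    try:
        # A (connect, read) tuple sets the two timeouts separately; a single
        # float would set both, exactly as the isinstance() checks above do.
        requests.get(
            "http://localhost:12341/a%40b/api/contents",
            timeout=(3.05, 10),
        )
    except requests.exceptions.ConnectionError as exc:
        # The MaxRetryError/NewConnectionError chain surfaces here.
        print("connection failed:", exc)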
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_delete_hidden_dir _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
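setup_class() above starts the NotebookApp in a daemon thread, waits on an Event, and then polls the contents API until the server answers; when the thread has already died, wait_until_alive() converts the connection error into the RuntimeError reported here. A stripped-down sketch of that poll loop (MAX_WAITTIME and POLL_INTERVAL values are illustrative, not the suite's real constants):

    import time
    import requests

    MAX_WAITTIME = 30     # seconds to keep trying (assumed value)
    POLL_INTERVAL = 0.5   # pause between attempts (assumed value)

    def wait_until_alive(url, server_thread):
        """Poll `url` until it answers, or give up with a RuntimeError."""
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return                      # server is up
            except Exception as e:
                if not server_thread.is_alive():
                    # Mirrors the log: the server thread died, stop waiting.
                    raise RuntimeError("The notebook server failed to start") from e
                time.sleep(POLL_INTERVAL)
        raise TimeoutError("server never became reachable")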
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
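create_connection() above resolves the host with socket.getaddrinfo() and then tries each returned address in turn; only when every candidate fails does it re-raise the last error. A tiny sketch of that lookup for the host and port in this log:

    import socket

    # Each result is (family, socktype, proto, canonname, sockaddr); for
    # "localhost" this typically yields both an IPv6 (::1) and an IPv4
    # (127.0.0.1) candidate that the connect loop tries one after another.
    for af, socktype, proto, _canon, sockaddr in socket.getaddrinfo(
            "localhost", 12341, socket.AF_UNSPEC, socket.SOCK_STREAM):
        print(af, sockaddr)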
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
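The ConnectionRefusedError above is the raw socket failure underneath the whole stack: a TCP connect to localhost:12341 is rejected because nothing is listening. Stripped of urllib3, the same failure looks like this (stdlib only; port taken from this log):

    import errno
    import socket

    try:
        # socket.create_connection() is the stdlib helper that urllib3's
        # create_connection() mirrors in the code above.
        sock = socket.create_connection(("localhost", 12341), timeout=1)
        sock.close()
    except OSError as exc:
        # Errno 111 (ECONNREFUSED) is what the traceback above reports.
        print("connect failed:", errno.errorcode.get(exc.errno, exc.errno), exc)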
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
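urlopen() documented above is the lowest-level request entry point the requests adapter drives, and its retries parameter is what carries the Retry(total=0) object through to increment(). A hedged sketch of calling it directly with the same arguments the adapter passes (assumes urllib3 v2; host, port and path taken from this log):

    from urllib3 import HTTPConnectionPool
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    pool = HTTPConnectionPool("localhost", 12341)
    try:
        pool.urlopen(
            "GET",
            "/a%40b/api/contents",
            retries=Retry(total=0, connect=None, read=False),
            redirect=False,
            assert_same_host=False,
            preload_content=False,
            decode_content=False,
        )
    except MaxRetryError as exc:
        # With total=0 a refused connection exhausts the budget immediately.
        print("gave up:", exc.reason)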
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
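The repeated "The above exception was the direct cause of the following exception" banners in this log come from PEP 3134 exception chaining: each layer re-raises its own exception type with "raise ... from err", so the original NewConnectionError stays attached as __cause__ all the way up to the final RuntimeError. A minimal illustration of the pattern (generic names, not from the test suite):

    def connect():
        raise ConnectionRefusedError(111, "Connection refused")

    def fetch():
        try:
            connect()
        except OSError as err:
            # "from err" records err as __cause__, which produces the
            # "direct cause" banner when the traceback is printed.
            raise RuntimeError("The notebook server failed to start") from err

    try:
        fetch()
    except RuntimeError as exc:
        print("cause:", repr(exc.__cause__))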
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of APITest.test_delete_hidden_file _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_file_checkpoints ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
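
The Retry values dumped further down in these tracebacks (Retry(total=0, connect=None, read=False, ...)) mean the retry budget is already spent when the first connection is refused, so increment() raises MaxRetryError immediately. A minimal sketch of that behaviour, not part of the test suite, assuming nothing is listening on localhost:12341 as in this run:

    # Hedged sketch: with total=0 the first connection error is fatal,
    # mirroring the MaxRetryError entries in this log.
    import urllib3
    from urllib3.util import Retry

    http = urllib3.PoolManager(retries=Retry(total=0, read=False))
    try:
        http.request("GET", "http://localhost:12341/a%40b/api/contents")
    except urllib3.exceptions.MaxRetryError as exc:
        print("gave up after the first failure:", exc.reason)
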
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_get_404_hidden _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of APITest.test_get_bad_type __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
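Stripped of the HTTP layers, the root cause above is an ordinary refused TCP connect. A small standalone sketch of the same errno at the socket level (port chosen to match the traceback):

import socket

try:
    # socket.create_connection() resolves the host, creates a socket and
    # connects, much like urllib3's helper quoted above; with no listener
    # this raises ConnectionRefusedError (errno 111 on Linux).
    socket.create_connection(("localhost", 12341), timeout=5)
except ConnectionRefusedError as exc:
    print(exc.errno, exc.strerror)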
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
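Retry(total=0, read=False) is what requests' default HTTPAdapter(max_retries=0) hands to urllib3, so the first connection error already exhausts the budget and increment() raises MaxRetryError, exactly as shown above. A standalone sketch of that interaction (host, port and path are illustrative):

from urllib3 import HTTPConnectionPool
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

pool = HTTPConnectionPool("localhost", 12341, retries=Retry(total=0, read=False))
try:
    pool.urlopen("GET", "/a%40b/api/contents")
except MaxRetryError as exc:
    # exc.reason carries the underlying NewConnectionError (ECONNREFUSED).
    print(exc.reason)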
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
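send() above also normalizes the timeout argument: a single float sets both the connect and read timeouts, while a (connect, read) tuple sets them separately before being converted to a urllib3 Timeout. A brief usage sketch (the URL is illustrative, and in this run it would fail like everything else):

import requests

url = "http://localhost:12341/a%40b/api/contents"
try:
    requests.get(url, timeout=5)            # one float: connect and read timeout
    requests.get(url, timeout=(3.05, 27))   # tuple: (connect timeout, read timeout)
except requests.exceptions.RequestException as exc:
    print(exc)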
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of APITest.test_get_binary_file_contents ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of APITest.test_get_contents_no_such_file ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of APITest.test_get_dir_no_content _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_get_nb_contents ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_get_nb_invalid _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_get_nb_no_content _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________ ERROR at setup of APITest.test_get_text_file_contents _____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
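# The error block above ends the same way as every other APITest setup error
# in this log: socket.connect() to localhost:12341 is refused, urllib3 wraps
# the ECONNREFUSED in NewConnectionError and, because the adapter's retry
# policy is Retry(total=0), immediately in MaxRetryError; requests re-raises
# that as ConnectionError, and wait_until_alive() turns it into
# RuntimeError("The notebook server failed to start") once the server thread
# is no longer alive. A minimal sketch of the polling step, assuming the
# host/port shown in the log; probe_contents_api is an illustrative name,
# not something from the notebook test suite.
import requests


def probe_contents_api(base_url="http://localhost:12341/a%40b/"):
    """Return the response if the contents API answers, else None."""
    try:
        # Mirrors fetch_url() in notebook/tests/launchnotebook.py: a plain GET.
        return requests.get(base_url + "api/contents")
    except requests.exceptions.ConnectionError:
        # The exception chain seen above, raised while nothing listens yet.
        return None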
211s ___________________ ERROR at setup of APITest.test_list_dirs ___________________
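# Why a single refused connection is already fatal here: requests constructs
# Retry(total=0, connect=None, read=False, redirect=None, status=None) for
# its default max_retries (the object shown in the traceback), and
# Retry.increment() lowers total to -1 on the first connection error, which
# makes is_exhausted() true and raises MaxRetryError instead of retrying.
# A small sketch of that bookkeeping using only the public urllib3 Retry API;
# the values mirror the log and nothing below comes from the test suite.
from urllib3.util.retry import Retry

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
print(retry.is_exhausted())              # False: total == 0 is not yet exhausted
one_attempt_later = retry.new(total=retry.total - 1)
print(one_attempt_later.is_exhausted())  # True: total == -1, so urlopen() gives up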
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of APITest.test_list_nonexistant_dir ______________ 211s 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError
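The two error lines above summarise the whole chain seen in the full traceback: the TCP connect to localhost:12341 is refused, urllib3 wraps that in NewConnectionError, the zero-retry policy shown in the traceback (Retry(total=0, connect=None, read=False, ...), requests' default for HTTPAdapter.max_retries) turns it into MaxRetryError, and requests re-raises it as ConnectionError. The snippet below is only an illustrative reproduction, not part of the test suite; it assumes nothing is listening on port 12341 (the port used by this test run).

# Illustrative reproduction of the error chain above (assumes no listener on localhost:12341).
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Mirror the retry policy shown in the traceback: Retry(total=0, read=False).
session.mount("http://", HTTPAdapter(max_retries=Retry(total=0, read=False)))

try:
    session.get("http://localhost:12341/a%40b/api/contents", timeout=5)
except requests.exceptions.ConnectionError as exc:
    # ConnectionRefusedError -> NewConnectionError -> MaxRetryError -> requests.ConnectionError
    print(type(exc).__name__, exc)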
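For the final RuntimeError, the harness code quoted in notebook/tests/launchnotebook.py starts NotebookApp in a daemon thread and then polls the contents API until it answers; if the server thread dies first, the loop gives up with "The notebook server failed to start". A minimal sketch of that polling pattern follows; the MAX_WAITTIME/POLL_INTERVAL values and the explicit sleep between attempts are assumptions for illustration, not the project's actual constants.

# Sketch of the wait-until-alive loop from the traceback; constants and the
# explicit sleep are assumptions, not the values used by the test module.
import time
import requests

MAX_WAITTIME = 30      # seconds (assumed)
POLL_INTERVAL = 0.1    # seconds (assumed)

def wait_until_alive(url, server_thread):
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return  # the server answered, so it is alive
        except Exception as exc:
            if not server_thread.is_alive():
                # The server thread already exited: fail fast instead of polling on.
                # This is the branch that produced the RuntimeError above.
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server did not become ready in time")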
211s ________________ ERROR at setup of APITest.test_list_notebooks _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection.
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
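            # [reviewer's aside -- not urllib3 source] For the failure captured in this
            # log the adapter passed Retry(total=0, connect=None, read=False): the
            # connection-error branch above leaves `connect` as None, `total` drops
            # from 0 to -1, so the Retry built below reports is_exhausted() and
            # increment() ends by raising the MaxRetryError seen further down.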
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
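# ----------------------------------------------------------------------------
# [Reviewer's sketch -- not part of notebook/tests/launchnotebook.py]
# The wait_until_alive() helper quoted in this traceback boils down to the
# polling pattern below: probe the contents API until it answers, and give up
# early if the server thread has already died.  MAX_WAITTIME and POLL_INTERVAL
# mirror the constants used by the real harness; the values here are guesses.
import time
import requests

MAX_WAITTIME = 30      # seconds (assumed)
POLL_INTERVAL = 0.1    # seconds (assumed)

def wait_for_server(base_url, server_thread):
    """Poll the contents API until the server answers or its thread dies."""
    url = base_url + 'api/contents'
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return  # got an HTTP response of any kind: the server is up
        except Exception as exc:
            if not server_thread.is_alive():
                # the branch taken in this log: the NotebookApp thread died
                # before it ever bound the port
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server never became responsive")
# ----------------------------------------------------------------------------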
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________________ ERROR at setup of APITest.test_mkdir _____________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
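            # [reviewer's aside -- not urllib3 source] In this test run nothing is
            # listening on localhost:12341 (the NotebookApp thread died before it
            # bound the port), so the connect() call below fails with ECONNREFUSED
            # and urllib3 surfaces it as NewConnectionError further up the stack.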
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
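# ----------------------------------------------------------------------------
# [Reviewer's sketch -- not part of the notebook test suite] The chain of
# errors recorded above (ConnectionRefusedError -> NewConnectionError ->
# MaxRetryError -> requests.exceptions.ConnectionError) can be reproduced
# outside the harness by pointing requests at a port with no listener; port
# 12341 is assumed to be unused, as it is in this run.
import requests

try:
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=5)
except requests.exceptions.ConnectionError as exc:
    # exc wraps urllib3's MaxRetryError; its .reason is the NewConnectionError
    # caused by [Errno 111] Connection refused
    print("reproduced:", exc)
# ----------------------------------------------------------------------------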
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_mkdir_hidden_400 ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
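            # [reviewer's aside -- not urllib3 source] create_connection() walks every
            # address getaddrinfo() returns for 'localhost' (IPv6 and/or IPv4); each
            # attempt is refused, the last OSError is kept in `err`, and the bare
            # `raise err` at connection.py:85 shown in this traceback re-raises it
            # once the loop runs out of addresses.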
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
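[Editor's note, not part of the captured log] The chain recorded above bottoms out in a plain [Errno 111] Connection refused against localhost:12341, which urllib3 wraps as NewConnectionError and requests re-raises as ConnectionError once the retry budget is spent. The standalone sketch below reproduces both layers of that chain outside the test suite; it assumes nothing is listening on port 12341 (the port this particular run happened to bind) and is purely illustrative.

    import socket
    import requests

    # Layer 1: the raw socket failure seen at urllib3/util/connection.py:73.
    # Assumes no listener on localhost:12341, as in this test run.
    try:
        sock = socket.create_connection(("localhost", 12341), timeout=1)
        sock.close()
    except ConnectionRefusedError as exc:
        print("socket layer:", exc)        # [Errno 111] Connection refused

    # Layer 2: the same failure surfaced through requests, matching the
    # requests.exceptions.ConnectionError recorded by the harness above.
    try:
        requests.get("http://localhost:12341/a%40b/api/contents")
    except requests.exceptions.ConnectionError as exc:
        print("requests layer:", exc)      # Max retries exceeded with url: ...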
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_mkdir_untitled _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
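[Editor's note, not part of the captured log] The Retry(total=0, connect=None, read=False, redirect=None, status=None) object appearing in the frames above is requests' default adapter policy (HTTPAdapter builds Retry(0, read=False) when max_retries is left at its default), so the first connection error exhausts the budget and increment() raises MaxRetryError rather than retrying. A minimal sketch of that exhaustion, assuming the urllib3 2.x API shown in this traceback; the conn=None argument is only a stand-in to keep the sketch self-contained.

    from urllib3.util.retry import Retry
    from urllib3.exceptions import MaxRetryError, NewConnectionError

    # Same policy as in the frames above: zero total retries.
    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)

    # Stand-in for the error urllib3 raised (conn=None is an illustration only).
    err = NewConnectionError(
        None, "Failed to establish a new connection: [Errno 111] Connection refused"
    )

    try:
        # total goes 0 -> -1, the new Retry is exhausted, so this raises.
        retry.increment(method="GET", url="/a%40b/api/contents", error=err)
    except MaxRetryError as exc:
        print(exc.reason)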
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________________ ERROR at setup of APITest.test_rename _____________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_rename_400_hidden _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_rename_existing ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________________ ERROR at setup of APITest.test_save ______________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
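The start_thread code reproduced just below runs NotebookApp in a daemon thread and guards the caller with a threading.Event, so a crash during startup releases the waiting test instead of hanging it; the main thread then detects the dead thread via is_alive(). The following is a stripped-down sketch of that start-in-a-thread pattern, with a deliberately failing start function standing in for the real application (all names here are hypothetical).

import threading

def start_in_thread(start_fn):
    # Run start_fn in a daemon thread and block until it signals readiness
    # (or fails); the finally clause guarantees the caller is always released.
    started = threading.Event()

    def runner():
        try:
            start_fn(started)      # expected to call started.set() once serving
        finally:
            started.set()          # a crash must not leave the caller waiting

    thread = threading.Thread(target=runner, daemon=True)
    thread.start()
    started.wait()
    return thread

def broken_start(started):
    # Simulates the failure mode in this log: the app dies before binding its port.
    raise RuntimeError("failed before the HTTP port was bound")

t = start_in_thread(broken_start)   # the thread's traceback is printed to stderr
t.join(timeout=2)
print("server thread alive:", t.is_alive())   # False, so the poll loop gives up immediately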
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________________ ERROR at setup of APITest.test_upload _____________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
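The Retry.increment source and the Retry(total=0, connect=None, read=False, ...) repr captured earlier in this traceback explain why a single refused connection is fatal here: with total=0, the first connection error exhausts the budget and increment() raises MaxRetryError, which requests then surfaces as ConnectionError. A standalone sketch of that exhaustion step against the urllib3 2.x API shown in this log (the URL and the None connection argument are placeholders):

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

# Same retry configuration as the adapter in this log: zero total retries.
retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
err = NewConnectionError(None, "Failed to establish a new connection: [Errno 111] Connection refused")

try:
    # total drops to -1 on the first connection error, is_exhausted() becomes
    # true, and increment() raises MaxRetryError instead of returning a new Retry.
    retry.increment(method="GET", url="/a%40b/api/contents", error=err)
except MaxRetryError as exc:
    print("retries exhausted:", exc.reason)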
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________________ ERROR at setup of APITest.test_upload_b64 ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________________ ERROR at setup of APITest.test_upload_txt ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_upload_txt_hidden _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________________ ERROR at setup of APITest.test_upload_v2 ___________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_checkpoints _______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _ ERROR at setup of GenericFileCheckpointsAPITest.test_checkpoints_separate_root _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
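Every setup error in this log collapses to the same root cause: nothing is accepting connections on localhost:12341, so each GET to /a%40b/api/contents fails with [Errno 111] Connection refused, urllib3 wraps it as NewConnectionError and, because the requests adapter passes Retry(total=0), immediately as MaxRetryError; requests re-raises that as ConnectionError, and the test harness raises RuntimeError once it sees the notebook thread is no longer alive. A minimal sketch of that polling pattern, assuming made-up MAX_WAITTIME and POLL_INTERVAL values (the real constants and helpers live in notebook/tests/launchnotebook.py):

import time
import requests

MAX_WAITTIME = 30    # assumed value, not the constant from launchnotebook.py
POLL_INTERVAL = 1    # assumed value

def wait_until_alive(base_url, server_thread):
    # Poll the contents API until the server answers, mirroring the
    # harness logic quoted in the traceback above (illustrative sketch).
    url = base_url + 'api/contents'
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)   # a refused socket surfaces as requests.exceptions.ConnectionError
            return              # any HTTP response at all means the server is up
        except Exception as exc:
            if not server_thread.is_alive():
                # same terminal condition as in the log: the server thread
                # died before it ever bound the port
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server never became responsive")

In the failing runs above, the GET raises ConnectionError and the NotebookApp thread is already dead, so the harness gives up with the RuntimeError instead of polling further.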
211s __ ERROR at setup of GenericFileCheckpointsAPITest.test_config_did_something ___ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection.
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
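The setup error above bottoms out in a refused TCP connection that requests re-raises as ConnectionError, which wait_until_alive() then converts into the RuntimeError. A minimal sketch of that same chain, assuming nothing is listening on port 12341 (the port comes from the traceback and is only illustrative):

import requests

# GET against a port with no listener: urllib3 raises NewConnectionError,
# the default Retry(total=0) turns it into MaxRetryError, and requests
# surfaces it as requests.exceptions.ConnectionError -- the chain in the log.
try:
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=5)
except requests.exceptions.ConnectionError as exc:
    print(type(exc).__name__, exc)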
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
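At the bottom of the stack, create_connection() above is an ordinary socket connect() that fails with errno 111. A bare-socket sketch of that step, again assuming the port is closed (as it is here, since the notebook server never started):

import socket

# A plain TCP connect to a closed port raises ConnectionRefusedError
# (errno 111 on Linux); urllib3 catches the OSError and wraps it in
# NewConnectionError, as shown in the traceback.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(1.0)
    try:
        sock.connect(("localhost", 12341))
    except ConnectionRefusedError as exc:
        print(exc.errno, exc.strerror)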
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_400_hidden _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
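The wait_until_alive() loop quoted above simply polls api/contents until the server answers or the server thread dies. A stand-alone sketch of that polling pattern (wait_for_server, max_wait and poll_interval are illustrative names, not part of launchnotebook.py):

import time
import requests

def wait_for_server(url, max_wait=30.0, poll_interval=0.1):
    # Poll until the URL answers anything at all, or give up after max_wait
    # seconds; a refused connection just means the server is not up yet.
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        try:
            requests.get(url, timeout=poll_interval)
            return True
        except requests.exceptions.ConnectionError:
            time.sleep(poll_interval)
    return False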
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
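        # Illustrative note, not part of urllib3's source: the branches above and
        # below decrement exactly one counter per attempt (connection error, read
        # error, other error, redirect, or status retry).  The locals in this
        # traceback show requests' default budget,
        # Retry(total=0, connect=None, read=False, redirect=None, status=None),
        # so the first ECONNREFUSED ([Errno 111]) drives total to -1,
        # is_exhausted() returns True, and MaxRetryError is raised further down.
        # A hedged example of a more forgiving budget (values are only an example):
        #     from urllib3.util.retry import Retry
        #     retries = Retry(total=3, connect=2, backoff_factor=0.5)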
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
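        # Illustrative note, not part of the notebook test suite: patch.multiple()
        # below replaces jupyter_core.paths.SYSTEM_JUPYTER_PATH and
        # SYSTEM_CONFIG_PATH with per-test temporary directories; the patcher is
        # started here and presumably stopped again in the matching teardown.
        # A minimal sketch of the same unittest.mock pattern, using a hypothetical
        # module and attribute names:
        #     from unittest.mock import patch
        #     patcher = patch.multiple(some_module, DATA_PATH=['/tmp/d'], CONF_PATH=['/tmp/c'])
        #     patcher.start()   # attributes stay replaced until patcher.stop()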
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_copy ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_dir_400 _______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
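Every setup error in this run follows the same chain: the TCP connect to localhost:12341 is refused because the notebook server never started, urllib3 wraps that in NewConnectionError and then MaxRetryError, and requests re-raises it as ConnectionError. A minimal sketch of that chain outside the test suite, assuming only that nothing is listening on the port:

import requests

try:
    # Nothing listens on this port, so the socket connect fails with ECONNREFUSED.
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=1)
except requests.exceptions.ConnectionError as exc:
    # requests surfaces urllib3's MaxRetryError / NewConnectionError as ConnectionError.
    print(type(exc).__name__, exc)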
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_path ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
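The Retry object in these tracebacks is built with total=0, so the first connection error already exhausts the budget and increment() converts it into MaxRetryError. A small illustration of that behaviour, assuming urllib3 v2 as shipped here; the NewConnectionError instance is constructed by hand purely for the demo:

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    # increment() drops total to -1, is_exhausted() becomes true, and the
    # original error is attached as the MaxRetryError's .reason.
    retry.increment(method="GET", url="/a%40b/api/contents",
                    error=NewConnectionError(None, "Connection refused"))
except MaxRetryError as exc:
    print(exc.reason)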
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_put_400 _______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
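# --- Editor's note, not part of the traceback above: a minimal, self-contained sketch of the
# --- isolation idea used by setup_class here -- point Jupyter's config/data lookups at
# --- temporary directories for the duration of a test, while leaving Python-env paths alone.
# --- The directory names are illustrative, not taken from the notebook test suite.
import os
import tempfile
from unittest.mock import patch

import jupyter_core.paths

with tempfile.TemporaryDirectory() as tmp:
    env = {
        "JUPYTER_CONFIG_DIR": os.path.join(tmp, "config"),
        "JUPYTER_DATA_DIR": os.path.join(tmp, "data"),
    }
    with patch.dict(os.environ, env), patch.multiple(
        jupyter_core.paths,
        SYSTEM_JUPYTER_PATH=[os.path.join(tmp, "share", "jupyter")],
        SYSTEM_CONFIG_PATH=[os.path.join(tmp, "etc", "jupyter")],
    ):
        # Inside this block jupyter_core.paths.jupyter_path() resolves under the temp tree
        # (plus the unpatched Python-environment locations).
        print(jupyter_core.paths.jupyter_path())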
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_copy_put_400_hidden ___ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
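# --- Editor's note, not part of the traceback above: the same failure one layer up. requests
# --- wraps urllib3's MaxRetryError in requests.exceptions.ConnectionError (see the send()
# --- frame further down), which is what a polling loop like wait_until_alive() has to catch.
# --- A hedged sketch; the function name, URL and timings are illustrative, not from the suite.
import time

import requests

def poll_until_alive(url="http://localhost:12341/a%40b/api/contents",
                     attempts=5, poll_interval=0.5):
    """Return True once GET succeeds, False if the server never comes up."""
    for _ in range(attempts):
        try:
            requests.get(url, timeout=1)
            return True
        except requests.exceptions.ConnectionError:
            # Covers the refused-connection case seen throughout this log.
            time.sleep(poll_interval)
    return False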
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
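# --- Editor's note, not part of the traceback above: the structural reason this is reported
# --- as "The notebook server failed to start" rather than hanging: the server runs in a
# --- daemon thread, a threading.Event is set in a finally: block even on failure, and the
# --- poller then checks thread.is_alive(). A generic sketch of that pattern with a
# --- deliberately failing stand-in "server" (names are illustrative):
import threading

started = threading.Event()

def start_server():
    try:
        raise RuntimeError("simulated startup failure")  # stand-in for app.initialize()/app.start()
    except RuntimeError:
        pass  # a real harness would log this; the traceback above shows the real failure mode
    finally:
        started.set()  # set even on failure so the waiting caller never blocks forever

thread = threading.Thread(target=start_server, daemon=True)
thread.start()
started.wait()
thread.join()

if not thread.is_alive():
    # Mirrors wait_until_alive(): a dead server thread means startup failed, so stop retrying.
    print("server thread exited -> report startup failure instead of polling further")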
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_create_untitled _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
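# --- Editor's note, not part of the traceback above: the send() frames in this chain also
# --- show how requests normalizes its timeout argument before handing it to urllib3: a single
# --- float bounds both the connect and read phases, a (connect, read) tuple bounds them
# --- separately, and anything else is rejected with ValueError before a connection is
# --- attempted. A small offline-safe usage sketch (the URL is illustrative):
import requests

# e.g. requests.get(url, timeout=3.05) or requests.get(url, timeout=(3.05, 27))
try:
    requests.get("http://localhost:12341/", timeout=(1, 2, 3))  # wrong shape
except ValueError as err:
    print(err)  # "Invalid timeout (1, 2, 3). Pass a (connect, read) timeout tuple, ..."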
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_create_untitled_txt ___ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
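The setup error above ends in RuntimeError("The notebook server failed to start"): the harness's wait_until_alive probes the contents API with requests.get until either the server answers or the server thread dies. Below is a minimal, illustrative sketch of that poll-until-alive pattern; the MAX_WAITTIME, POLL_INTERVAL and base URL handling here are placeholders, not the harness's real configuration.

import time
import requests

MAX_WAITTIME = 30      # hypothetical: total seconds to keep probing
POLL_INTERVAL = 0.1    # hypothetical: delay between probes

def wait_until_alive(base_url, server_thread):
    # Poll the contents API until it responds; fail fast if the thread
    # running the server has already exited (as in the traceback above).
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return  # the server answered, so it is alive
        except Exception as exc:
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)  # connection refused; retry shortly
    raise TimeoutError("server did not come up within MAX_WAITTIME")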
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_delete_hidden_dir ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
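Each individual probe fails with the chain shown in the traceback: the kernel refuses the TCP connection (ECONNREFUSED), urllib3 wraps it in NewConnectionError, the zero-retry policy is exhausted immediately and raises MaxRetryError, and requests re-raises that as ConnectionError. A small sketch of that behaviour, assuming nothing is listening on the chosen port:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry(total=0) gives up after the first connection error, matching the
# Retry(total=0, connect=None, read=False, ...) object shown in the log.
session.mount("http://", HTTPAdapter(max_retries=Retry(total=0)))

try:
    # Port 12341 is assumed to have no listener, as in the failing tests.
    session.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    # MaxRetryError (caused by NewConnectionError) surfaces from requests
    # as a ConnectionError, which is what wait_until_alive keeps catching.
    print(type(exc).__name__, exc)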
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_delete_hidden_file ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
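[Editor's aside, not part of the captured test output] The retries parameter documented above is what produces the MaxRetryError in this log: the harness passes Retry(total=0), so the very first connection error exhausts it. A small reproduction sketch of that failure mode, reusing the log's URL only for illustration:

    # Illustrative reproduction of the failure mode; not part of the test suite.
    import urllib3
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    http = urllib3.PoolManager()
    try:
        # total=0 means the first connection error already exhausts the Retry object
        http.request("GET", "http://localhost:12341/a%40b/api/contents",
                     retries=Retry(total=0))
    except MaxRetryError as err:
        print("exhausted retries:", err.reason)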
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
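[Editor's aside, not part of the captured test output] The wait_until_alive() method in the traceback above polls the contents API until the server answers or the server thread dies. A generic version of that bounded poll; the function name, constants and defaults here are made up for illustration:

    # Generic bounded-poll sketch; max_wait/poll_interval values are arbitrary.
    import time
    import requests

    def wait_until_alive(url, max_wait=30.0, poll_interval=0.1):
        """Return once GET `url` succeeds, or raise RuntimeError after max_wait seconds."""
        deadline = time.monotonic() + max_wait
        while time.monotonic() < deadline:
            try:
                requests.get(url, timeout=1.0)
                return
            except requests.exceptions.ConnectionError:
                time.sleep(poll_interval)
        raise RuntimeError(f"server at {url} did not come up within {max_wait}s")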
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
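[Editor's aside, not part of the captured test output] The setup code that continues just below builds a traitlets Config object and points NotebookNotary at an in-memory signature database. The same pattern in isolation looks like the sketch here; only the NotebookNotary.db_file line mirrors the harness, the port value is arbitrary:

    # Small traitlets.config sketch; nested sections are created lazily on attribute access.
    from traitlets.config import Config

    c = Config()
    c.NotebookNotary.db_file = ":memory:"   # keep the signature DB off the filesystem
    c.NotebookApp.port = 12341              # illustrative value only
    print(c)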
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_file_checkpoints _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
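[Editor's aside, not part of the captured test output] setup_class above launches the notebook server on a daemon thread and uses a threading.Event so that a failed start cannot hang the suite (the Event is also set in the finally block). A stripped-down version of that start/wait pattern, with a trivial http.server stand-in rather than the NotebookApp from the traceback:

    # Start-in-a-thread sketch; the HTTPServer below is a dummy stand-in for the real app.
    import threading
    from http.server import HTTPServer, BaseHTTPRequestHandler

    started = threading.Event()
    server = HTTPServer(("127.0.0.1", 0), BaseHTTPRequestHandler)  # port 0: pick a free port

    def serve():
        try:
            started.set()            # signal "listening" before entering the loop
            server.serve_forever()
        finally:
            started.set()            # also set on failure so the waiter never hangs

    thread = threading.Thread(target=serve, daemon=True)
    thread.start()
    started.wait()
    if not thread.is_alive():
        raise RuntimeError("The server thread exited before becoming ready")
    print("dummy server listening on port", server.server_address[1])
    server.shutdown()
    server.server_close()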
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_404_hidden ______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
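As the docstring above explains, retries may be a Retry instance, an int, False, or None; urlopen() funnels whatever it receives through Retry.from_int(), which appears further down in this traceback, and the requests adapter passes its max_retries value straight through (retries=self.max_retries), which is why Retry(total=0, ...) shows up in these failures. A minimal sketch of the accepted forms:

    from urllib3.util.retry import Retry

    # The same normalization urlopen() applies via Retry.from_int():
    print(Retry.from_int(2))       # an int becomes Retry(total=2, ...)
    print(Retry.from_int(False))   # False disables retries: the first error is re-raised
    print(Retry.from_int(Retry(total=0, connect=None, read=False, redirect=None, status=None)))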
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
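wait_until_alive(), quoted above, simply polls the contents API until a request succeeds or the server thread has died. A stand-alone sketch of that pattern with hypothetical MAX_WAITTIME and POLL_INTERVAL values and an explicit sleep (the real constants live in notebook/tests/launchnotebook.py):

    import time
    import requests

    MAX_WAITTIME = 30     # hypothetical; see launchnotebook.py for the real values
    POLL_INTERVAL = 0.5

    def wait_until_alive(url, server_thread):
        """Poll `url` until it answers, or fail fast once the server thread dies."""
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return
            except Exception as e:
                if not server_thread.is_alive():
                    raise RuntimeError("The notebook server failed to start") from e
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("The notebook server never responded")

In the failures above it is the is_alive() branch that fires: the server thread has already exited, so every test in the class errors out during setup_class.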
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
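The adapter's send(), quoted earlier in this traceback, converts the user-supplied timeout into a urllib3 Timeout: a single float sets both the connect and read timeouts, a (connect, read) tuple sets them separately, and anything else raises the ValueError shown there. A small sketch against the port these tests use, assuming (as in this log) that nothing is listening on it:

    import requests

    url = "http://localhost:12341/a%40b/api/contents"
    try:
        requests.get(url, timeout=5)            # connect and read timeout both 5 s
        requests.get(url, timeout=(3.05, 27))   # separate connect / read timeouts
    except requests.exceptions.ConnectionError as err:
        # With no server behind the port this fails exactly like the tests above.
        print(err)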
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______ ERROR at setup of GenericFileCheckpointsAPITest.test_get_bad_type _______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
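The ConnectionRefusedError near the top of this traceback comes from the plain TCP connect performed by urllib3's create_connection helper; the standard library exposes the same convenience as socket.create_connection. A minimal sketch, assuming nothing is listening on port 12341:

    import socket

    try:
        # Resolve, create the socket, apply the timeout, then connect -- the
        # same steps as the helper quoted above.
        sock = socket.create_connection(("localhost", 12341), timeout=2)
        sock.close()
    except ConnectionRefusedError as err:
        print(err)   # [Errno 111] Connection refused, as in the log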
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
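Retry.increment(), quoted just above, returns a new Retry with the relevant counter decremented and raises MaxRetryError once the budget is exhausted; with the Retry(total=0, ...) seen in these failures, the first connection failure is already the last. A small sketch, using a plain ConnectionRefusedError as a stand-in for the NewConnectionError seen in this log:

    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    try:
        # total drops from 0 to -1, so the new Retry object is already exhausted.
        retry.increment(method="GET", url="/a%40b/api/contents",
                        error=ConnectionRefusedError(111, "Connection refused"))
    except MaxRetryError as e:
        print(e.reason)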
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_binary_file_contents _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_contents_no_such_file _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___ ERROR at setup of GenericFileCheckpointsAPITest.test_get_dir_no_content ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
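The locals above show the pool using Retry(total=0, connect=None, read=False), which matches requests' default adapter configuration, so a single connection error exhausts the retry budget and increment() raises MaxRetryError straight away. A minimal sketch of that exhaustion path (exact exception plumbing differs slightly between urllib3 versions):

    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    retry = Retry(total=0, connect=None, read=False)
    try:
        # One failed attempt drops total below zero; the new Retry object is
        # exhausted, so MaxRetryError is raised from the underlying error.
        retry.increment(method="GET", url="/a%40b/api/contents",
                        error=OSError(111, "Connection refused"))
    except MaxRetryError as exc:
        print(exc.reason)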
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
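As the adapter code above shows, once urllib3 gives up with MaxRetryError wrapping a NewConnectionError, requests re-raises it as requests.exceptions.ConnectionError. A hedged end-to-end illustration, again assuming nothing is listening on localhost:12341:

    import requests

    try:
        requests.get("http://localhost:12341/a%40b/api/contents", timeout=5)
    except requests.exceptions.ConnectionError as exc:
        # exc wraps the MaxRetryError, which in turn wraps the
        # "[Errno 111] Connection refused" NewConnectionError seen above.
        print(type(exc).__name__, exc)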
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_contents _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
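The RuntimeError above comes from the harness in notebook/tests/launchnotebook.py: setup_class() launches NotebookApp in a daemon thread, and wait_until_alive() polls GET <base_url>api/contents until the server answers, giving up as soon as the server thread is no longer alive. A minimal sketch of that polling pattern (the constants and helper signature below are illustrative, not the project's exact values):

    import time
    import requests

    MAX_WAITTIME = 30      # illustrative; the real constants live in launchnotebook.py
    POLL_INTERVAL = 0.5

    def wait_until_alive(base_url, server_thread):
        """Poll the contents API until the server answers or its thread dies."""
        url = base_url + "api/contents"
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return
            except Exception as err:
                if not server_thread.is_alive():
                    # Matches the error reported above: the NotebookApp thread
                    # exited before it ever started listening on its port.
                    raise RuntimeError("The notebook server failed to start") from err
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("The notebook server never became responsive")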
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_invalid ______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
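The start_thread() body shown in setup_class() above follows a common pattern: run the server's event loop in a daemon thread and use a threading.Event both to signal readiness and to avoid hanging the caller when startup fails. A stripped-down sketch of that pattern, assuming tornado is available (as it is for notebook):

    import asyncio
    import threading

    from tornado.ioloop import IOLoop

    started = threading.Event()

    def serve():
        # Each thread needs its own asyncio loop before tornado's IOLoop can wrap it.
        asyncio.set_event_loop(asyncio.new_event_loop())
        loop = IOLoop.current()
        loop.add_callback(started.set)   # fires only once the loop is actually running
        try:
            loop.start()
        finally:
            started.set()                # also set on failure, so the caller never hangs

    thread = threading.Thread(target=serve, daemon=True)
    thread.start()
    started.wait()
    print("event loop is running in", thread.name)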
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
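A minimal sketch (not urllib3 code, and not part of the captured output) of why the Retry(total=0, connect=None, read=False, ...) object shown in this traceback turns the very first refused connection into MaxRetryError: increment() drops total to -1, is_exhausted() then reports the budget spent, and the NewConnectionError becomes the error's reason. The None passed as the connection argument below is purely for illustration:

    from urllib3.exceptions import MaxRetryError, NewConnectionError
    from urllib3.util.retry import Retry

    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    try:
        retry.increment(method="GET", url="/a%40b/api/contents",
                        error=NewConnectionError(None, "[Errno 111] Connection refused"))
    except MaxRetryError as exc:
        print("exhausted after one attempt; reason:", exc.reason)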
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
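For orientation, a reduced standalone version of the polling loop that wait_until_alive() in this log performs against the test server. This is an illustrative sketch, not notebook test-suite code: wait_for_server is a hypothetical helper, and the MAX_WAITTIME / POLL_INTERVAL values merely mirror the constants named in the log.

    import time
    import requests

    MAX_WAITTIME = 30      # seconds to keep polling; illustrative value
    POLL_INTERVAL = 0.1    # pause between attempts; illustrative value

    def wait_for_server(url):
        """Poll *url* until it answers or the waiting budget runs out."""
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return                      # any HTTP response means the server is up
            except requests.exceptions.ConnectionError:
                time.sleep(POLL_INTERVAL)   # nothing listening yet, try again
        raise TimeoutError(f"no server answered at {url}")

    # wait_for_server("http://localhost:12341/a%40b/api/contents")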
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_get_nb_no_content ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _ ERROR at setup of GenericFileCheckpointsAPITest.test_get_text_file_contents __ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
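The adapter code quoted above is what turns urllib3's MaxRetryError into requests.exceptions.ConnectionError when the notebook port is closed. A minimal sketch of the same failure mode, assuming nothing is listening on localhost:12341 (the port the test harness tried to use):

    import requests
    from requests.adapters import HTTPAdapter

    session = requests.Session()
    # Zero retries, mirroring the Retry(total=0, connect=None, read=False, ...)
    # object shown in the traceback above.
    session.mount("http://", HTTPAdapter(max_retries=0))

    try:
        session.get("http://localhost:12341/a%40b/api/contents", timeout=2)
    except requests.exceptions.ConnectionError as exc:
        # The refused TCP connection surfaces here wrapped in
        # MaxRetryError / NewConnectionError, exactly as in the log.
        print("connection refused:", exc)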
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
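For reference, the tuple-to-TimeoutSauce conversion in send() earlier in this traceback corresponds to the caller passing a (connect, read) pair. A small illustration; the URL and values are only illustrative:

    import requests

    # (connect, read): give up on the TCP connect after 3 s, and on any single
    # blocking read after 10 s. requests converts this pair into a urllib3
    # Timeout via the TimeoutSauce(connect=..., read=...) branch shown above.
    try:
        requests.get("http://localhost:12341/api/status", timeout=(3, 10))
    except requests.exceptions.RequestException as exc:
        print(exc)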
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_list_dirs ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
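The wait_until_alive() loop shown above keeps polling the contents API until the server answers or the launcher thread dies. A stripped-down sketch of that pattern; MAX_WAITTIME and POLL_INTERVAL are placeholders, the real values live in notebook/tests/launchnotebook.py:

    import time
    import requests

    MAX_WAITTIME = 30      # seconds; hypothetical value
    POLL_INTERVAL = 0.1    # seconds; hypothetical value

    def wait_until_alive(url: str) -> None:
        # Poll until the endpoint answers; re-raise the last error if it never does.
        last_error = None
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url, timeout=1)
                return
            except requests.exceptions.ConnectionError as exc:
                last_error = exc
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("server did not come up in time") from last_error

    # wait_until_alive("http://localhost:12341/a%40b/api/contents")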
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
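The ConnectionRefusedError above comes straight from the socket layer: nothing is bound to localhost:12341 because the notebook server never started. A quick way to reproduce that check in isolation, still assuming the port is unused:

    import socket

    def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
        # True if a TCP connection succeeds, mirroring the connect attempt that
        # urllib3's create_connection() makes for each getaddrinfo() result.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:  # includes ConnectionRefusedError, i.e. [Errno 111]
            return False

    print(port_open("localhost", 12341))  # expected: False while the server is down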
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __ ERROR at setup of GenericFileCheckpointsAPITest.test_list_nonexistant_dir ___ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
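setup_class() above isolates the test server by patching environment variables and the jupyter_core.paths constants before launching the app. A reduced sketch of that isolation step, assuming jupyter_core is importable; the JUPYTER_CONFIG_DIR entry merely stands in for whatever get_patch_env() returns:

    import tempfile
    from unittest.mock import patch

    import jupyter_core.paths

    tmp = tempfile.mkdtemp()

    # Redirect the system-wide search paths into a temporary directory so the
    # test server cannot pick up configuration from the host, as in setup_class().
    path_patch = patch.multiple(
        jupyter_core.paths,
        SYSTEM_JUPYTER_PATH=[f"{tmp}/share/jupyter"],
        SYSTEM_CONFIG_PATH=[f"{tmp}/etc/jupyter"],
    )
    env_patch = patch.dict("os.environ", {"JUPYTER_CONFIG_DIR": f"{tmp}/config"})

    path_patch.start()
    env_patch.start()
    # ... launch the NotebookApp here ...
    path_patch.stop()
    env_patch.stop()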
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
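The requests-level failure recorded above can be reduced to a short, self-contained sketch. It is illustrative only: the port 12341, the /a%40b/api/contents path and the Retry(total=0) policy are taken from the traceback, and the example assumes nothing is listening on that port, so the GET fails with the same exception chain (NewConnectionError -> MaxRetryError -> requests.exceptions.ConnectionError).

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Mirror the retry policy shown in the traceback: with a zero retry budget,
# the first refused connection already exhausts Retry, urlopen raises
# MaxRetryError, and requests re-wraps it as ConnectionError.
session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=Retry(total=0)))

try:
    session.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    print(exc)  # Max retries exceeded ... Connection refused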
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_list_notebooks ______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
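Stripped of the urllib3 and requests layers, the root failure shown in these tracebacks is an ordinary refused TCP connect: nothing is listening on the port the test harness chose. A minimal sketch, assuming port 12341 (taken from the traceback) is still unused on the local machine:

import socket

# Plain socket connect to the same host/port as in the traceback; with no
# server listening, connect() fails with ECONNREFUSED, which urllib3 later
# wraps as NewConnectionError.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    sock.connect(("localhost", 12341))
except ConnectionRefusedError as exc:
    print(exc)  # [Errno 111] Connection refused
finally:
    sock.close()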
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir __________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
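For reference, the polling logic quoted from notebook/tests/launchnotebook.py above boils down to the pattern sketched below. This is a simplified standalone version, not the project's actual code: the MAX_WAITTIME and POLL_INTERVAL values are assumed, and the server thread is passed in explicitly instead of being read from the test class.

import time
import requests

MAX_WAITTIME = 30     # seconds; assumed value for this sketch
POLL_INTERVAL = 0.1   # seconds; assumed value for this sketch

def wait_until_alive(base_url, server_thread):
    """Poll the contents API until the server answers or its thread dies."""
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except Exception as exc:
            # This is the branch seen in the log: the server thread has
            # already exited, so the probe can never succeed and the
            # harness gives up immediately.
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server did not respond in time")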
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
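# --- illustrative sketch, not output of this test run -------------------
# The trace above shows requests' default Retry(total=0, connect=None,
# read=False) being exhausted by Retry.increment(), which wraps the refused
# connection in MaxRetryError.  The same path can be triggered with urllib3
# directly, assuming (as in this run) that nothing listens on port 12341:
from urllib3 import HTTPConnectionPool
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

pool = HTTPConnectionPool("localhost", 12341)
try:
    # total=0 allows no retries, so the first connection error is wrapped
    # in MaxRetryError by Retry.increment(), exactly as in the traceback.
    pool.urlopen("GET", "/a%40b/api/contents",
                 retries=Retry(total=0, connect=None, read=False))
except MaxRetryError as exc:
    print("retries exhausted:", exc.reason)   # NewConnectionError(... refused)
# ------------------------------------------------------------------------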
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir_hidden_400 _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
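# --- illustrative sketch, not output of this test run -------------------
# The `retries` semantics described in the docstring above can be observed
# with a PoolManager against the same closed port: an integer 0 is exhausted
# immediately and surfaces MaxRetryError, while retries=False disables
# wrapping and re-raises the underlying connection error as-is.
import urllib3
from urllib3.exceptions import MaxRetryError, NewConnectionError

http = urllib3.PoolManager()
url = "http://localhost:12341/a%40b/api/contents"   # closed port, as above

try:
    http.request("GET", url, retries=0)       # wrapped in MaxRetryError
except MaxRetryError as exc:
    print("retries=0:", type(exc.reason).__name__)

try:
    http.request("GET", url, retries=False)   # raised immediately, unwrapped
except NewConnectionError as exc:
    print("retries=False:", exc)
# ------------------------------------------------------------------------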
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
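# --- illustrative sketch, not output of this test run -------------------
# At the requests level, the MaxRetryError shown above is translated by
# HTTPAdapter.send() into requests.exceptions.ConnectionError, which is the
# exception wait_until_alive() keeps catching while it polls the contents
# API.  A minimal reproduction, assuming the server is still not listening:
import requests

try:
    requests.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    # exc wraps the urllib3 MaxRetryError (exc.args[0]), whose .reason is
    # the NewConnectionError raised for the refused socket connection.
    print("server not up yet:", exc)
# ------------------------------------------------------------------------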
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_mkdir_untitled ______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
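# --- illustrative sketch, not output of this test run -------------------
# The root cause in every one of these tracebacks is a plain TCP
# "connection refused" at the socket layer.  The same condition can be
# checked with the standard library alone, assuming port 12341 has no
# listener (as in this run):
import socket

try:
    with socket.create_connection(("localhost", 12341), timeout=1):
        print("a server is listening")
except ConnectionRefusedError as exc:     # [Errno 111] on Linux
    print("nothing listening:", exc)
except socket.timeout:
    print("host reachable but not answering")
# ------------------------------------------------------------------------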
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
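# --- illustrative sketch, not output of this test run -------------------
# The timeout handling quoted in HTTPAdapter.send() accepts either a single
# float (applied to both the connect and the read phase) or a
# (connect, read) tuple; anything else is rejected with the ValueError shown
# in the source.  A sketch of the accepted forms, using the same placeholder
# endpoint as above:
import requests

url = "http://localhost:12341/a%40b/api/contents"

try:
    requests.get(url, timeout=5)            # 5s connect and 5s read
    requests.get(url, timeout=(3.05, 27))   # 3.05s connect, 27s read
except requests.exceptions.RequestException as exc:
    print("request failed:", exc)

try:
    requests.get(url, timeout=(1, 2, 3))    # invalid: not a 2-tuple
except ValueError as exc:
    print(exc)                              # "Invalid timeout ..."
# ------------------------------------------------------------------------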
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
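The send() code above also shows how the adapter translates urllib3 failures into requests' own exception types, so callers only need to handle requests.exceptions.ConnectionError rather than the MaxRetryError/NewConnectionError pair. A small illustrative probe built on that mapping; the function name and timeout are assumptions, not part of the test suite:

    import requests

    def probe(url):
        """Return True if the URL answered, False if the connection failed."""
        try:
            requests.get(url, timeout=5)
        except requests.exceptions.ConnectionError:
            # Covers the MaxRetryError/NewConnectionError chain seen in this log.
            return False
        return True

    # probe("http://localhost:12341/a%40b/api/contents")  # URL from the log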
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________ ERROR at setup of GenericFileCheckpointsAPITest.test_rename __________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
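setup_class() and wait_until_alive() above show the harness pattern: start the server, then poll its contents API until it answers, bailing out early if the server thread has died. A standalone sketch of that loop; MAX_WAITTIME and POLL_INTERVAL are assumed values here, since the real constants live in launchnotebook.py and are not shown in this log:

    import time
    import requests

    MAX_WAITTIME = 30     # assumed value; the harness defines its own constant
    POLL_INTERVAL = 0.1   # assumed value; the harness defines its own constant

    def wait_until_alive(url, server_thread):
        """Poll url until it answers, or fail fast if the server thread died."""
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return
            except Exception as e:
                if not server_thread.is_alive():
                    # Same failure mode as the RuntimeError in the traceback above.
                    raise RuntimeError("The notebook server failed to start") from e
                time.sleep(POLL_INTERVAL)
        raise TimeoutError("server did not answer within MAX_WAITTIME")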
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
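The create_connection() code above shows that the failure happens at the TCP level: sock.connect() is refused because nothing is listening on the test port, before any HTTP is exchanged. A stdlib-only sketch of the same check; the helper name is illustrative, and the port is the throwaway port from this log:

    import socket

    def port_is_open(host, port, timeout=1.0):
        try:
            # socket.create_connection resolves the host and attempts a TCP connect.
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except ConnectionRefusedError:
            # [Errno 111]: the host is reachable but nothing is listening.
            return False
        except OSError:
            # Timeouts, unreachable hosts, resolution failures, etc.
            return False

    # port_is_open("localhost", 12341)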
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
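The timeout handling in send(), quoted again above, accepts either a single number (applied to both the connect and read phases) or a (connect, read) tuple, and rejects anything else with the ValueError shown. From the caller's side that looks like this; the URL is taken from the log and the values are arbitrary examples:

    import requests

    url = "http://localhost:12341/a%40b/api/contents"

    # Single float: 5 s connect timeout and 5 s read timeout.
    # requests.get(url, timeout=5)

    # Tuple: 3 s to establish the connection, 30 s to wait for data.
    # requests.get(url, timeout=(3, 30))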
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_rename_400_hidden ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
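The setup_class() body repeated above starts the NotebookApp on a daemon thread, gives that thread its own asyncio loop, and uses a threading.Event plus an IOLoop callback so the main thread knows when the loop is actually running. A generic, simplified sketch of that pattern; this is not the notebook harness itself, only the thread/Event/IOLoop wiring:

    import asyncio
    from threading import Event, Thread
    from tornado.ioloop import IOLoop

    started = Event()

    def run_loop():
        try:
            # Each thread needs its own asyncio event loop for Tornado to use.
            asyncio.set_event_loop(asyncio.new_event_loop())
            loop = IOLoop.current()
            # Fires once the loop is running, like loop.add_callback(started.set)
            # in the harness above.
            loop.add_callback(started.set)
            loop.start()
        finally:
            # Set the event even on failure so started.wait() below cannot hang.
            started.set()

    thread = Thread(target=run_loop, daemon=True)
    thread.start()
    started.wait()
    # At this point the harness goes on to poll the HTTP API (wait_until_alive).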
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____ ERROR at setup of GenericFileCheckpointsAPITest.test_rename_existing _____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
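The wait_until_alive() loop shown above is the generic poll-until-ready pattern: keep issuing GET requests against the contents API until one succeeds, bail out early if the server thread has already died, and let MAX_WAITTIME/POLL_INTERVAL bound the total wait. A stripped-down sketch of the same pattern, with names invented here rather than taken from launchnotebook.py:

    import time
    import requests

    def wait_until_alive(url, max_waittime=30.0, poll_interval=0.1,
                         thread_is_alive=lambda: True):
        """Poll `url` until it answers, the server thread dies, or time runs out."""
        for _ in range(int(max_waittime / poll_interval)):
            try:
                requests.get(url)
                return                       # got any HTTP response: server is up
            except Exception as err:
                if not thread_is_alive():    # the server thread already exited
                    raise RuntimeError("The notebook server failed to start") from err
                time.sleep(poll_interval)
        raise TimeoutError(f"server at {url} did not come up in {max_waittime}s")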
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
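The Retry object in these tracebacks is Retry(total=0, connect=None, read=False, redirect=None, status=None), which is what requests installs by default. Following the increment() code above, the connection error decrements total from 0 to -1, the rebuilt Retry is exhausted, and MaxRetryError is raised on the very first failure. A small sketch of that arithmetic, assuming the urllib3 2.x behaviour shown here:

    from urllib3.util.retry import Retry

    # The retry policy seen in the traceback: no retries at all.
    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    print(retry.is_exhausted())              # False -- nothing has failed yet

    # One connection failure: increment() builds a copy via Retry.new() with
    # total reduced by one; because that copy is exhausted, increment()
    # raises MaxRetryError instead of returning it.
    after_one_failure = retry.new(total=retry.total - 1)
    print(after_one_failure.is_exhausted())  # True -- hence the immediate error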
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
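The tmp() helper in setup_class above creates each scratch directory and ignores EEXIST by hand. On Python 3 the same idempotent behaviour is available directly from os.makedirs; a minimal equivalent, sketched here for comparison only:

    import os

    def tmp(base, *parts):
        """Create (if missing) and return a scratch path below `base`."""
        path = os.path.join(base, *parts)
        os.makedirs(path, exist_ok=True)   # replaces the errno.EEXIST check
        return path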
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of GenericFileCheckpointsAPITest.test_save ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
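For comparison with the retries behaviour documented above, this is roughly how the same request looks one layer down, issued straight through urllib3 with retries disabled; the URL is the one from the traceback, and against the dead test server the call raises the underlying connection error immediately rather than a MaxRetryError (a sketch, not part of the test suite):

    import urllib3

    http = urllib3.PoolManager()
    try:
        resp = http.request(
            "GET",
            "http://localhost:12341/a%40b/api/contents",  # URL from the traceback
            retries=False,          # raise the first error, return redirects as-is
            preload_content=False,  # stream the body instead of buffering it
        )
        print(resp.status)
        resp.release_conn()
    except urllib3.exceptions.HTTPError as err:
        print("request failed:", err)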
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
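The send() docstring above describes the two accepted timeout forms: a single float applied to both connect and read, or a (connect, read) tuple that the adapter converts into a urllib3 Timeout. The failing fetch_url passes no timeout at all, which is why the traceback shows Timeout(connect=None, read=None, total=None). A hedged example of passing one explicitly:

    import requests

    url = "http://localhost:12341/a%40b/api/contents"   # URL from the traceback

    try:
        # (connect, read): fail fast on connect, allow a slower response body.
        # A single float, e.g. timeout=5, sets both values to the same limit.
        requests.get(url, timeout=(3.05, 27))
    except requests.exceptions.ConnectionError:
        pass   # expected here: nothing is listening on port 12341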
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________ ERROR at setup of GenericFileCheckpointsAPITest.test_upload __________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
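All of these setup errors end in the same generic "RuntimeError: The notebook server failed to start" because start_thread() above runs in a daemon thread: whatever exception actually killed NotebookApp is swallowed, and only the finally-block's started.set() is visible to the test. A hypothetical way to capture and re-raise the real failure while debugging; this is a sketch, not what launchnotebook.py does:

    import threading

    class ServerThread(threading.Thread):
        """Run the server target and remember any exception it raised."""

        def __init__(self, target):
            super().__init__(daemon=True)
            self._target_fn = target
            self.exc = None

        def run(self):
            try:
                self._target_fn()
            except BaseException as err:   # keep whatever killed the server
                self.exc = err

    # In wait_until_alive(), once the thread is seen to be dead:
    #     if cls.notebook_thread.exc is not None:
    #         raise cls.notebook_thread.exc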
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
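[Editor's note] The Retry.increment() code above explains why a single refused connection is fatal here: requests' HTTPAdapter defaults to Retry(0, read=False), exactly the object shown in the locals, so the first connection error already exhausts the budget and becomes MaxRetryError. A client that wants to ride out a slow-starting server can mount an adapter with a more generous policy; the numbers below are illustrative, not values used by the test suite.

# Sketch: mount an HTTPAdapter with retries and backoff instead of the default
# Retry(0, read=False). Values are illustrative.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=5, connect=5, backoff_factor=0.2)
session.mount("http://", HTTPAdapter(max_retries=retry))

try:
    response = session.get("http://localhost:12341/a%40b/api/contents", timeout=2)
    print(response.status_code)
except requests.exceptions.ConnectionError as exc:
    print("still refused after retries:", exc)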
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
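[Editor's note] For context on the wait_until_alive() frames in this traceback: the harness polls base_url + 'api/contents' and only gives up early if the server thread has died, which is the branch taken here. A self-contained sketch of that polling shape is below; MAX_WAITTIME and POLL_INTERVAL are names from the traceback, but their values here are assumptions.

# Sketch of the readiness poll used by wait_until_alive() above.
# The constants' values are assumed for illustration.
import time
import requests

MAX_WAITTIME = 30     # seconds (assumed)
POLL_INTERVAL = 0.5   # seconds (assumed)

def wait_until_alive(base_url, server_thread):
    url = base_url + "api/contents"
    deadline = time.monotonic() + MAX_WAITTIME
    while time.monotonic() < deadline:
        try:
            requests.get(url, timeout=POLL_INTERVAL)
            return
        except requests.exceptions.ConnectionError as exc:
            if not server_thread.is_alive():
                # mirrors the RuntimeError raised in the traceback
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"server at {url} never became reachable")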
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_b64 ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
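[Editor's note] A side note on the timeout handling at the top of this frame: the adapter accepts either a single number or a (connect, read) tuple and folds both into a urllib3 Timeout. From the caller's side that is just the timeout argument to requests; the URL and values below are illustrative.

# Sketch of the two timeout forms the adapter code above accepts.
import requests

url = "http://localhost:12341/a%40b/api/contents"
try:
    requests.get(url, timeout=3.0)      # one number: connect and read timeout
    requests.get(url, timeout=(2, 10))  # tuple: (connect timeout, read timeout)
except requests.exceptions.RequestException as exc:
    print(exc)  # on this testbed both attempts fail with a refused connection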
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
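[Editor's note] Each layer of this failure re-raises with "raise ... from err" (ConnectionRefusedError, then NewConnectionError, MaxRetryError, requests' ConnectionError, and finally the harness's RuntimeError), which is why pytest prints the chained "direct cause" sections above. A small illustrative helper for walking such a chain:

# Walk an exception's __cause__ chain, the structure pytest is printing above.
def explain(exc):
    depth = 0
    while exc is not None:
        print("  " * depth + f"{type(exc).__name__}: {exc}")
        exc = exc.__cause__
        depth += 1

try:
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:
        raise RuntimeError("The notebook server failed to start") from err
except RuntimeError as final:
    explain(final)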
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_txt ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_txt_hidden ____ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
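A note on the repeated failures in this section: every ERROR here is raised during test setup, before the test body runs, and the chain is always the same. sock.connect() to localhost:12341 is refused, urllib3 wraps that as NewConnectionError and then MaxRetryError, requests re-raises it as ConnectionError, and wait_until_alive() converts it into RuntimeError("The notebook server failed to start") because the server thread is no longer alive (why that thread exited is not visible in this excerpt). Below is a minimal sketch of that polling pattern, not the test suite's actual code; the MAX_WAITTIME and POLL_INTERVAL values are assumed for illustration (the real constants live in notebook/tests/launchnotebook.py), and the port and URL path are the ones shown in the log.

import time
from threading import Thread
import requests

MAX_WAITTIME = 30     # seconds; assumed value for illustration
POLL_INTERVAL = 0.1   # seconds; assumed value for illustration

def wait_until_alive(url: str, server_thread: Thread) -> None:
    """Poll `url` until it responds, failing fast if the server thread has died."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return  # the server answered, so it is alive
        except requests.ConnectionError as e:
            # A refused connection while the thread is already dead means the
            # server never bound its port; that is the failure seen in this log.
            if not server_thread.is_alive():
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"server at {url} never came up within {MAX_WAITTIME}s")

Calling wait_until_alive("http://localhost:12341/a%40b/api/contents", server_thread) with a server_thread that has already exited without binding port 12341 would reproduce the RuntimeError reported for each test setup in this run.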
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________ ERROR at setup of GenericFileCheckpointsAPITest.test_upload_v2 ________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of KernelAPITest.test_connections _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of KernelAPITest.test_default_kernel ______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of KernelAPITest.test_kernel_handler ______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of KernelAPITest.test_main_kernel_handler ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
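As a side note (illustrative only, not part of the quoted notebook test file): the comment above explains why the harness patches the system-wide and user-wide Jupyter directories. The same isolation can be sketched with jupyter_core's standard environment variables (JUPYTER_CONFIG_DIR, JUPYTER_DATA_DIR, JUPYTER_RUNTIME_DIR); a minimal sketch under that assumption:

    import os
    import tempfile
    from unittest.mock import patch

    # Point every Jupyter lookup path at throwaway directories so a test run
    # cannot read or write the real user configuration.
    tmp_root = tempfile.mkdtemp()
    isolated_env = {
        "JUPYTER_CONFIG_DIR": os.path.join(tmp_root, "config"),
        "JUPYTER_DATA_DIR": os.path.join(tmp_root, "data"),
        "JUPYTER_RUNTIME_DIR": os.path.join(tmp_root, "runtime"),
    }
    for path in isolated_env.values():
        os.makedirs(path, exist_ok=True)

    with patch.dict(os.environ, isolated_env):
        pass  # launch the notebook/Jupyter process under the isolated directories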
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of KernelAPITest.test_no_kernels ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
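As a side note (illustrative only, not part of the captured test output): the adapter code quoted above accepts the requests-level timeout either as a single float or as a (connect, read) tuple and normalises it into a urllib3 Timeout. A minimal sketch of that equivalence:

    from urllib3.util import Timeout

    # requests.get(url, timeout=5.0)        -> connect and read both 5 seconds
    # requests.get(url, timeout=(3.05, 27)) -> connect 3.05 s, read 27 s
    # Internally the adapter builds the corresponding urllib3 object:
    t = Timeout(connect=3.05, read=27)
    print(t.connect_timeout, t.read_timeout)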
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________ ERROR at setup of AsyncKernelAPITest.test_connections _____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
211s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 211s > super().setup_class() 211s 211s notebook/services/kernels/tests/test_kernels_api.py:206: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of AsyncKernelAPITest.test_default_kernel ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
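The urlopen docstring quoted above describes the retries parameter that decides how this request fails: the pool was invoked with Retry(total=0, connect=None, read=False, redirect=None, status=None), which is what requests' HTTPAdapter builds for its default max_retries=0, so the first refused connection already exhausts the budget and is re-raised as MaxRetryError further down. A purely illustrative urllib3 sketch of that behaviour next to a more forgiving configuration (the host and port are the test server's from this log, not a live service):

    # Illustrative sketch only; mirrors the Retry object quoted in this log.
    import urllib3
    from urllib3.util.retry import Retry

    # requests' HTTPAdapter(max_retries=0) builds Retry(0, read=False), so a
    # refused connection is wrapped in MaxRetryError on the first attempt.
    strict = urllib3.PoolManager(retries=Retry(total=0, read=False))

    # A hypothetical, more patient client allows a few connect retries with
    # backoff before giving up with the same MaxRetryError.
    patient = urllib3.PoolManager(retries=Retry(total=5, connect=5, backoff_factor=0.5))

    try:
        # One attempt only: nothing listens on 12341, so this fails at once.
        strict.request("GET", "http://localhost:12341/a%40b/api/contents")
    except urllib3.exceptions.MaxRetryError as exc:
        print("refused on first attempt:", exc.reason)

In this run the retry policy is beside the point anyway: nothing listens on port 12341 because the notebook server thread never started, so any number of retries would end the same way.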
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
211s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 211s > super().setup_class() 211s 211s notebook/services/kernels/tests/test_kernels_api.py:206: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of AsyncKernelAPITest.test_kernel_handler ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
211s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 211s > super().setup_class() 211s 211s notebook/services/kernels/tests/test_kernels_api.py:206: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________ ERROR at setup of AsyncKernelAPITest.test_main_kernel_handler _________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
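Every setup error in this run bottoms out in the same socket-level refusal on localhost:12341. A quick way to reproduce just that probe with the standard library instead of urllib3 (the port is the one from the log):

import socket

try:
    with socket.create_connection(("localhost", 12341), timeout=1):
        print("something is listening on 12341")
except ConnectionRefusedError as exc:
    # Errno 111: no listener on the port -- the situation the test suite hits
    # because the notebook server thread never started.
    print("connection refused:", exc)
except OSError as exc:
    print("other socket error:", exc)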
[... remainder of the AsyncKernelAPITest.test_main_kernel_handler setup traceback elided: it repeats the failure above verbatim (ConnectionRefusedError -> NewConnectionError -> MaxRetryError -> requests.exceptions.ConnectionError) and ends with "RuntimeError: The notebook server failed to start" at notebook/tests/launchnotebook.py:59 ...]
211s _____________ ERROR at setup of AsyncKernelAPITest.test_no_kernels _____________
[... identical setup-failure traceback elided; its concluding frames follow ...]
211s raise SkipTest("AsyncKernelAPITest tests skipped due to down-level jupyter_client!") 211s > super().setup_class() 211s 211s notebook/services/kernels/tests/test_kernels_api.py:206: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of KernelFilterTest.test_config ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
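The launchnotebook.py frames above show the harness polling api/contents until the server answers or the server thread dies. A simplified sketch of that poll loop (MAX_WAITTIME, POLL_INTERVAL and the base URL below are placeholder assumptions, not the suite's real values):

import time
import requests

MAX_WAITTIME = 30     # seconds -- placeholder
POLL_INTERVAL = 0.1   # seconds -- placeholder

def wait_until_alive(base_url="http://localhost:12341/a%40b/"):
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return True
        except requests.ConnectionError:
            time.sleep(POLL_INTERVAL)
    return False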
211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 
211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 
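            Example (an illustrative sketch, not taken from this log; it assumes
            urllib3 v2 as packaged here and reproduces the refused connection)::

                import urllib3
                from urllib3.exceptions import MaxRetryError
                from urllib3.util.retry import Retry

                # Retry(total=0) mirrors the retries value visible in this
                # traceback: one connection attempt, no retries.
                pool = urllib3.HTTPConnectionPool("localhost", 12341,
                                                  retries=Retry(total=0))
                try:
                    pool.urlopen("GET", "/a%40b/api/contents", redirect=False,
                                 preload_content=False, decode_content=False)
                except MaxRetryError as exc:
                    print(exc.reason)  # NewConnectionError: ... refused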
211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 
211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 
211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 
211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 
211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 
211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 
'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of KernelCullingTest.test_culling _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
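    # socket_options here is [(6, 1, 1)], urllib3's default, i.e.
    # (socket.IPPROTO_TCP, socket.TCP_NODELAY, 1); each tuple is applied as
    # sock.setsockopt(level, optname, value) to disable Nagle's algorithm on
    # the new connection.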
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of APITest.test_get_kernel_resource_file ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
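        Example (an illustrative, stdlib-only approximation of the call made
        below; not part of this log)::

            import socket
            # Raises ConnectionRefusedError (errno 111) when, as in this run,
            # nothing is listening on localhost:12341.
            sock = socket.create_connection(("localhost", 12341), timeout=5.0)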
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
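A few frames up, requests' HTTPAdapter.send() normalises the timeout argument into a (connect, read) pair before urlopen() is called. A small sketch of both outcomes, reusing the port from this log (assumed to have no listener):

import requests

try:
    # A 2-tuple is split into separate connect and read timeouts.
    requests.get("http://localhost:12341/", timeout=(1, 5))
except requests.exceptions.ConnectionError:
    pass  # still refused - the server in this run never started

try:
    # Anything that is neither a number nor a 2-tuple reaches the
    # ValueError branch shown in adapters.py above.
    requests.get("http://localhost:12341/", timeout=(1, 5, 9))
except ValueError as exc:
    print(exc)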
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
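The MaxRetryError above is the work of Retry.increment(): requests passes its default Retry(total=0, connect=None, read=False, redirect=None, status=None), so the first connection error already exhausts the budget. A short sketch of that behaviour in isolation (the OSError stands in for the NewConnectionError seen in the log):

from urllib3.util.retry import Retry
from urllib3.exceptions import MaxRetryError

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    # increment() would normally return a new Retry with decremented
    # counters; with total=0 the result is already exhausted, so it raises.
    retry.increment(
        method="GET",
        url="/a%40b/api/contents",
        error=OSError(111, "Connection refused"),
    )
except MaxRetryError as exc:
    print("gave up:", exc.reason)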
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
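Seen from the caller, the whole chain above collapses into a single requests.exceptions.ConnectionError that still carries the urllib3 exceptions which caused it. A sketch, again assuming nothing listens on port 12341:

import requests
from urllib3.exceptions import MaxRetryError, NewConnectionError

try:
    requests.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    inner = exc.args[0]                                   # the MaxRetryError from urlopen()
    print(isinstance(inner, MaxRetryError))               # True
    print(isinstance(inner.reason, NewConnectionError))   # True, Errno 111 inside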
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ________________ ERROR at setup of APITest.test_get_kernelspec _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
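The RuntimeError that closes the traceback above comes from the test harness's wait_until_alive() polling loop. A self-contained sketch of that pattern; MAX_WAITTIME and POLL_INTERVAL mirror the constants in notebook/tests/launchnotebook.py, whose actual values are not shown in this log, so the numbers here are placeholders:

import time
import threading
import requests

MAX_WAITTIME = 30.0    # placeholder value
POLL_INTERVAL = 0.1    # placeholder value

def wait_until_alive(url: str, server_thread: threading.Thread) -> None:
    """Poll url until it answers, failing fast if the server thread dies."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except Exception as e:
            if not server_thread.is_alive():
                # Same fail-fast behaviour as the harness in this log.
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("Server did not become ready in time")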
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
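The comment block just above, and the patch.multiple() call that follows it in the log, isolate the test run from the machine's real Jupyter directories. A sketch of that isolation pattern; the /tmp paths and the JUPYTER_CONFIG_DIR value are placeholders, not values from this run:

import os
from unittest.mock import patch
import jupyter_core.paths

path_patch = patch.multiple(
    jupyter_core.paths,
    SYSTEM_JUPYTER_PATH=["/tmp/test-share-jupyter"],   # placeholder
    SYSTEM_CONFIG_PATH=["/tmp/test-etc-jupyter"],      # placeholder
)
env_patch = patch.dict(os.environ, {"JUPYTER_CONFIG_DIR": "/tmp/test-config"})
env_patch.start()
path_patch.start()
try:
    pass  # code under test must not touch the real user or system config
finally:
    path_patch.stop()
    env_patch.stop()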
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of APITest.test_get_kernelspec_spaces _____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
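setup_class() above starts the NotebookApp in a daemon thread and signals readiness through an Event whose set() sits in a finally: block, so a failed start unblocks the waiting test instead of hanging it; that is why this run ends in RuntimeError rather than a timeout. A stripped-down sketch of the handshake, with a simulated startup failure in place of the real app:

import threading

started = threading.Event()

def start_server() -> None:
    try:
        # The real harness builds and starts NotebookApp here; we simulate
        # a startup failure instead (its traceback goes to stderr).
        raise RuntimeError("simulated startup failure")
    finally:
        started.set()  # unblock the waiter even though startup failed

thread = threading.Thread(target=start_server, daemon=True)
thread.start()
started.wait()   # returns promptly thanks to the finally: above
thread.join()
print("thread alive:", thread.is_alive())  # False - the check wait_until_alive() relies on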
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
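Editor's aside on the Retry object quoted above: requests' HTTPAdapter defaults to max_retries=Retry(0, read=False), which is exactly the Retry(total=0, connect=None, read=False, redirect=None, status=None) shown in this traceback, so the very first refused connection exhausts the budget inside Retry.increment() and is re-raised as MaxRetryError (and then as requests.exceptions.ConnectionError). A minimal sketch, assuming only the requests/urllib3 versions packaged here, of how a client could opt into real connection retries; the session, URL and retry numbers are illustrative and not part of the test suite:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    # Retry refused connections with backoff instead of failing on the first
    # ECONNREFUSED the way the default Retry(0, read=False) does.
    session = requests.Session()
    retries = Retry(total=5, connect=5, backoff_factor=0.5)
    session.mount("http://", HTTPAdapter(max_retries=retries))
    # e.g. session.get("http://localhost:12341/a%40b/api/contents")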
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of APITest.test_get_nonexistant_kernelspec ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
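Editor's aside on the harness quoted above: launchnotebook.py starts NotebookApp in a daemon thread and wait_until_alive() polls GET <base_url>api/contents until it answers, raising RuntimeError("The notebook server failed to start") as soon as the thread is no longer alive, which is the failure recorded for every test in this class. A rough re-statement of that polling pattern, with hypothetical names (max_wait, poll_interval, server_thread) standing in for the module's MAX_WAITTIME/POLL_INTERVAL constants:

    import time
    import requests

    def wait_until_alive(url, server_thread, max_wait=30.0, poll_interval=0.1):
        # Probe the contents API until it responds; bail out early if the server
        # thread has already died so a failed startup cannot hang the test run.
        deadline = time.monotonic() + max_wait
        while time.monotonic() < deadline:
            try:
                requests.get(url)
                return
            except Exception as exc:
                if not server_thread.is_alive():
                    raise RuntimeError("The notebook server failed to start") from exc
                time.sleep(poll_interval)
        raise TimeoutError(f"Server at {url} never came up")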
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of APITest.test_get_nonexistant_resource ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
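The Retry(total=0, connect=None, read=False, redirect=None, status=None) object shown in the locals above has no budget left for connection errors: increment() drops total to -1, the resulting object reports is_exhausted(), and the original NewConnectionError is wrapped in a MaxRetryError. A small sketch of that bookkeeping against urllib3's public Retry API (the conn argument of NewConnectionError is left as None here purely for illustration):

    from urllib3.exceptions import MaxRetryError, NewConnectionError
    from urllib3.util.retry import Retry

    # Mirror the Retry object shown in the locals above.
    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    refused = NewConnectionError(
        None, "Failed to establish a new connection: [Errno 111] Connection refused"
    )

    try:
        # increment() normally returns a new Retry with decremented counters;
        # with total=0 the new object is already exhausted, so it raises instead.
        retry.increment(method="GET", url="/a%40b/api/contents", error=refused)
    except MaxRetryError as exc:
        print(exc.reason)  # the NewConnectionError passed in above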
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
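The except MaxRetryError block quoted above is what converts urllib3's failure into the requests-level error reported by these tests: the refused socket becomes a NewConnectionError, the exhausted Retry turns it into a MaxRetryError, and HTTPAdapter.send() re-raises that as requests.exceptions.ConnectionError. A minimal standalone reproduction, assuming (as in this run) that nothing is listening on localhost:12341:

    import requests

    # Nothing is listening on this port, matching the failed test runs above.
    try:
        requests.get("http://localhost:12341/a%40b/api/contents", timeout=1)
    except requests.exceptions.ConnectionError as exc:
        # ConnectionRefusedError -> NewConnectionError -> MaxRetryError inside urllib3,
        # re-raised by HTTPAdapter.send() as requests.exceptions.ConnectionError.
        print(type(exc).__name__)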
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _______________ ERROR at setup of APITest.test_list_kernelspecs ________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
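urllib3's create_connection(), quoted in this frame, simply walks the socket.getaddrinfo() results for the host and tries to connect to each address, re-raising the last OSError when every attempt fails; that re-raise is the `raise err` line at util/connection.py:85 in the traceback above. A simplified sketch of the same loop (the helper name is ours; the TCP_NODELAY option mirrors the [(6, 1, 1)] shown in the locals):

    import socket

    def connect_to_first_address(host: str, port: int, timeout: float = 1.0) -> socket.socket:
        """Stripped-down version of the getaddrinfo/connect loop quoted above."""
        last_err = None
        for af, socktype, proto, _canonname, sa in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM
        ):
            sock = socket.socket(af, socktype, proto)
            # urllib3's default socket option [(6, 1, 1)] is (IPPROTO_TCP, TCP_NODELAY, 1).
            sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
            sock.settimeout(timeout)
            try:
                sock.connect(sa)  # raises ConnectionRefusedError when the port is closed
                return sock
            except OSError as exc:
                last_err = exc
                sock.close()
        if last_err is None:
            raise OSError("getaddrinfo returned no addresses")
        raise last_err  # matches the `raise err` frame in this traceback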
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
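The Retry(total=0, connect=None, read=False, ...) objects in these frames are not something the notebook test suite configures; they correspond to requests' default HTTPAdapter max_retries of 0, so the first connection error is reported instead of retried. How an int, False or None is coerced into a Retry object, as the docstring above describes (a short illustration, values chosen arbitrarily):

    from urllib3.util.retry import Retry

    print(Retry.from_int(3))      # retry connection errors up to 3 times
    print(Retry.from_int(0))      # no retries: the first error exhausts the budget
    print(Retry.from_int(False))  # retries disabled: errors re-raised immediately, redirects returned
    print(Retry.from_int(None, default=Retry(0, read=False)))  # None falls back to the supplied default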
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
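As the timeout branch quoted above shows, a (connect, read) tuple is unpacked into independent connect and read limits, while a bare float applies to both; TimeoutSauce is requests' import alias for urllib3's Timeout class. A quick illustration using urllib3's public Timeout (values are arbitrary):

    from urllib3.util import Timeout  # imported by requests as TimeoutSauce

    per_phase = Timeout(connect=3.05, read=27)  # like passing timeout=(3.05, 27) to requests
    both = Timeout(connect=5, read=5)           # like passing timeout=5

    print(per_phase.connect_timeout, per_phase.read_timeout)  # 3.05 27
    print(both.connect_timeout, both.read_timeout)            # 5 5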
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _____________ ERROR at setup of APITest.test_list_kernelspecs_bad ______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
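The setup_class code captured above isolates each test run by patching the process environment and the jupyter_core.paths search paths into a throwaway temporary directory before the notebook server thread is started. The following is a minimal, standalone sketch of that isolation pattern, not taken from the test suite itself; the specific environment variable names and directory layout are illustrative assumptions.

    # Sketch of the isolation pattern used by setup_class above (assumptions:
    # the JUPYTER_* variable names and the temp-dir layout are illustrative).
    import os
    from tempfile import TemporaryDirectory
    from unittest.mock import patch

    import jupyter_core.paths

    tmp_dir = TemporaryDirectory()

    def tmp(*parts):
        path = os.path.join(tmp_dir.name, *parts)
        os.makedirs(path, exist_ok=True)
        return path

    # Redirect HOME and the per-user Jupyter locations into the temp dir.
    env_patch = patch.dict(os.environ, {
        "HOME": tmp("home"),
        "JUPYTER_DATA_DIR": tmp("data"),
        "JUPYTER_CONFIG_DIR": tmp("config"),
    })
    # Replace the system-wide search paths so nothing from the host leaks in.
    path_patch = patch.multiple(
        jupyter_core.paths,
        SYSTEM_JUPYTER_PATH=[tmp("share", "jupyter")],
        SYSTEM_CONFIG_PATH=[tmp("etc", "jupyter")],
    )

    env_patch.start()
    path_patch.start()
    try:
        print(jupyter_core.paths.SYSTEM_JUPYTER_PATH)  # now points into tmp_dir
    finally:
        path_patch.stop()
        env_patch.stop()
        tmp_dir.cleanup()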
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of APITest.test_list_formats __________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
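The exception chain repeated in these tracebacks is always the same: the socket connect to localhost:12341 is refused because the notebook server never started listening, urllib3 wraps that as NewConnectionError, Retry(total=0) is exhausted immediately and raises MaxRetryError, and requests re-raises it as requests.exceptions.ConnectionError. The sketch below reproduces that chain outside the test suite, assuming nothing is listening on the chosen port; the port and URL are illustrative, not authoritative.

    # Sketch reproducing the failure mode seen in these tracebacks.
    # Assumption: no service is listening on localhost:12341.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.exceptions import MaxRetryError, NewConnectionError

    session = requests.Session()
    # Mirror the test client's behaviour: no connection retries at all.
    session.mount("http://", HTTPAdapter(max_retries=0))

    try:
        session.get("http://localhost:12341/a%40b/api/contents")
    except requests.exceptions.ConnectionError as exc:
        # exc wraps urllib3's MaxRetryError; its .reason is the NewConnectionError
        # carrying the underlying "[Errno 111] Connection refused" (on Linux).
        inner = exc.args[0]
        assert isinstance(inner, MaxRetryError)
        assert isinstance(inner.reason, NewConnectionError)
        print("connection refused, as in the log:", inner.reason)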
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of SessionAPITest.test_create _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
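The Retry object shown in these frames is the adapter's default max_retries, built with total=0, so the first connection error exhausts the budget and increment() raises MaxRetryError. A minimal sketch of that mechanism, assuming the urllib3 v2 API quoted above; the NewConnectionError is constructed with conn=None purely for illustration:

    from urllib3.util.retry import Retry
    from urllib3.exceptions import MaxRetryError, NewConnectionError

    # Same shape as the Retry in the locals above: zero total budget.
    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)

    # Stand-in for the wrapped "[Errno 111] Connection refused" (conn=None for brevity).
    error = NewConnectionError(
        None, "Failed to establish a new connection: [Errno 111] Connection refused"
    )

    try:
        # increment() returns a copy with total == -1; is_exhausted() is then
        # true, so MaxRetryError is raised from the original error.
        retry.increment(method="GET", url="/a%40b/api/contents", error=error)
    except MaxRetryError as exc:
        print("gave up:", exc.reason)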
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
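The "except MaxRetryError" branch quoted above is where the urllib3 failure is re-raised as requests.exceptions.ConnectionError, which is what the test harness then sees. The same symptom can be sketched with a plain GET against a port nobody is listening on; 12341 is simply the port this test run picked:

    import requests

    try:
        # With no server bound to the port, conn.urlopen() fails, the default
        # Retry(total=0) gives up immediately, and the adapter re-raises the
        # MaxRetryError as requests.exceptions.ConnectionError.
        requests.get("http://localhost:12341/a%40b/api/contents", timeout=5)
    except requests.exceptions.ConnectionError as exc:
        print(type(exc).__name__, "->", exc)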
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________ ERROR at setup of SessionAPITest.test_create_console_session _________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
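wait_until_alive() above polls the contents API and converts the connection failures into RuntimeError only once the server thread itself has died, which is why each affected test errors during setup instead of hanging for the full MAX_WAITTIME. A generic sketch of that pattern (a hypothetical helper, not the code in notebook/tests/launchnotebook.py):

    import time
    import threading
    import requests

    def wait_until_alive(url: str, server_thread: threading.Thread,
                         max_wait: float = 30.0, poll_interval: float = 0.1) -> None:
        """Poll `url` until it answers, failing fast if the serving thread died."""
        deadline = time.monotonic() + max_wait
        while time.monotonic() < deadline:
            try:
                requests.get(url, timeout=poll_interval)
                return  # any HTTP response at all means the server is up
            except Exception as exc:
                if not server_thread.is_alive():
                    raise RuntimeError("The notebook server failed to start") from exc
                time.sleep(poll_interval)
        raise TimeoutError(f"server at {url} did not come up within {max_wait}s")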
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
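The underlying failure originates in create_connection(): nothing is bound to the target port, so the connect() call a few lines below fails with errno 111 before urllib3 wraps it as NewConnectionError. A bare-socket sketch of the same refusal; the (6, 1, 1) entry in socket_options above is IPPROTO_TCP / TCP_NODELAY / 1:

    import errno
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # the (6, 1, 1) option above
    try:
        sock.connect(("localhost", 12341))  # no notebook server is listening here
    except ConnectionRefusedError as exc:
        assert exc.errno == errno.ECONNREFUSED  # 111 on Linux
        print("refused:", exc)
    finally:
        sock.close()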
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
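The retries parameter documented in this docstring is exactly what the requests adapter passed down (retries=self.max_retries, a Retry with total=0). Driving the connection pool directly shows the urllib3-level MaxRetryError before requests re-wraps it; a sketch assuming the same closed port:

    from urllib3 import HTTPConnectionPool
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    pool = HTTPConnectionPool("localhost", 12341)
    try:
        pool.urlopen(
            "GET", "/a%40b/api/contents",
            retries=Retry(total=0, read=False),
            redirect=False,
            preload_content=False,
        )
    except MaxRetryError as exc:
        # exc.reason is the NewConnectionError wrapping [Errno 111].
        print("pool gave up:", exc.reason)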
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of SessionAPITest.test_create_deprecated ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
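The send() method earlier in this traceback translates urllib3's MaxRetryError into a requests-level exception (ConnectTimeout, RetryError, ProxyError, SSLError, or plain ConnectionError), which is why the harness only ever sees requests.exceptions.ConnectionError. A small sketch of catching the refused connection at that level, assuming nothing is listening on the port:

    import requests

    try:
        # Nothing listening here: urllib3 raises NewConnectionError, wraps it
        # in MaxRetryError, and the adapter re-raises it as
        # requests.exceptions.ConnectionError.
        requests.get("http://localhost:12341/", timeout=1)
    except requests.exceptions.ConnectionError as exc:
        print("server not reachable:", exc)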
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s __________ ERROR at setup of SessionAPITest.test_create_file_session ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
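setup_class above starts NotebookApp on a daemon thread, signals readiness through a threading.Event, and then wait_until_alive() polls the contents API, converting a dead server thread into RuntimeError. A self-contained sketch of that start-and-poll pattern using a plain http.server stand-in instead of NotebookApp (the Event handshake is omitted and all names and timings are illustrative, not the harness's own):

    import threading
    import time
    from http.server import HTTPServer, BaseHTTPRequestHandler

    import requests

    MAX_WAITTIME, POLL_INTERVAL = 30, 0.1   # illustrative values


    class _OkHandler(BaseHTTPRequestHandler):
        def do_GET(self):                    # minimal "alive" endpoint
            self.send_response(200)
            self.end_headers()


    def start_server():
        server = HTTPServer(("127.0.0.1", 0), _OkHandler)  # port 0: any free port
        thread = threading.Thread(target=server.serve_forever, daemon=True)
        thread.start()
        return server, thread


    def wait_until_alive(url, thread):
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url, timeout=1)
                return
            except Exception as exc:
                if not thread.is_alive():
                    raise RuntimeError("The server thread died before answering") from exc
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("Timed out waiting for the server")


    server, thread = start_server()
    wait_until_alive("http://127.0.0.1:%d/" % server.server_address[1], thread)
    server.shutdown()

The harness's RuntimeError in this log is the "thread is no longer alive" branch: the NotebookApp thread exited during startup, so every poll was refused.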
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
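The urlopen docstring above describes the retries parameter that matters here: the adapter hands in its Retry(total=0, ...) object, so the single failed attempt is wrapped in MaxRetryError, whereas retries=False would let the underlying NewConnectionError escape unchanged. A minimal sketch against urllib3 directly, reusing the host and port from the failing tests and assuming nothing listens there:

    import urllib3
    from urllib3.exceptions import MaxRetryError, NewConnectionError
    from urllib3.util.retry import Retry

    pool = urllib3.HTTPConnectionPool("localhost", 12341, timeout=1)

    try:
        # A Retry object: the failed attempt is counted and wrapped.
        pool.urlopen("GET", "/a%40b/api/contents", retries=Retry(total=0))
    except MaxRetryError as exc:
        print("retries exhausted:", exc.reason)

    try:
        # retries=False: retries disabled, the raw connection error escapes.
        pool.urlopen("GET", "/a%40b/api/contents", retries=False)
    except NewConnectionError as exc:
        print("raw connection error:", exc)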
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
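The comment above is the point of all the patching in setup_class: every Jupyter lookup path is redirected into a fresh TemporaryDirectory so the tests cannot see or modify the host's configuration, and the notary database stays in memory. A condensed sketch of that isolation, assuming jupyter_core is importable; the environment variable names are the standard jupyter_core ones, not necessarily what get_patch_env() sets:

    import os
    from tempfile import TemporaryDirectory
    from unittest.mock import patch

    import jupyter_core.paths

    tmp_dir = TemporaryDirectory()

    def tmp(*parts):
        path = os.path.join(tmp_dir.name, *parts)
        os.makedirs(path, exist_ok=True)
        return path

    # Point environment-derived locations at the scratch directory
    # (variable names assumed, not taken from the harness).
    env_patch = patch.dict(os.environ, {
        "HOME": tmp("home"),
        "JUPYTER_CONFIG_DIR": tmp("config"),
        "JUPYTER_DATA_DIR": tmp("data"),
        "JUPYTER_RUNTIME_DIR": tmp("runtime"),
    })
    env_patch.start()

    # Redirect the system-wide search paths, as the harness above does.
    path_patch = patch.multiple(
        jupyter_core.paths,
        SYSTEM_JUPYTER_PATH=[tmp("share", "jupyter")],
        SYSTEM_CONFIG_PATH=[tmp("etc", "jupyter")],
    )
    path_patch.start()

    # ... run isolated code here ...

    path_patch.stop()
    env_patch.stop()
    tmp_dir.cleanup()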
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________ ERROR at setup of SessionAPITest.test_create_with_kernel_id __________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________________ ERROR at setup of SessionAPITest.test_delete _________________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
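The retries forms listed in the docstring above are all normalised through Retry.from_int; requests' default adapter builds a Retry much like the Retry(total=0, connect=None, read=False, ...) shown in this traceback. A small sketch of the equivalences using only the public urllib3 API:

    from urllib3.util.retry import Retry

    # A plain integer becomes the ``total`` budget of a Retry object.
    print(Retry.from_int(3).total)          # 3

    # False disables retries: total is literally False and the first error
    # is re-raised immediately instead of being counted.
    print(Retry.from_int(False).total)      # False

    # A Retry instance is passed through unchanged.
    r = Retry(total=0, read=False)
    print(Retry.from_int(r) is r)           # True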
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
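increment() above never mutates the Retry in place: it builds a copy via new() with decremented counters and raises MaxRetryError once that copy reports is_exhausted(). A minimal sketch of the bookkeeping with the public helpers only (no request is made):

    from urllib3.util.retry import Retry

    r = Retry(total=1, connect=None, read=False)

    # What increment() effectively does for a connection error: copy with total - 1.
    r1 = r.new(total=r.total - 1)
    print(r1.total, r1.is_exhausted())      # 0 False

    # One more failure drives the budget negative -> exhausted, which is
    # exactly where increment() raises MaxRetryError in the code above.
    r2 = r1.new(total=r1.total - 1)
    print(r2.total, r2.is_exhausted())      # -1 True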
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
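The retries=self.max_retries passed to urlopen() above is whatever the adapter was constructed with, and by default requests performs no retries (hence total=0 in this log). A minimal sketch of raising that budget per session, using the public HTTPAdapter/Session API (URL and values are placeholders):

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    retry = Retry(
        total=3,
        connect=3,
        backoff_factor=0.5,                  # exponential backoff between attempts
        status_forcelist=(502, 503, 504),    # also retry on these status codes
    )

    session = requests.Session()
    adapter = HTTPAdapter(max_retries=retry)
    session.mount("http://", adapter)
    session.mount("https://", adapter)

    # With a non-zero budget, transient connect failures are retried before a
    # ConnectionError like the one in this log is finally raised.
    # session.get("http://localhost:12341/a%40b/api/contents", timeout=5)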
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ____________ ERROR at setup of SessionAPITest.test_modify_kernel_id ____________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
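The preload_content / release_conn behaviour described in the docstring above is easiest to see at the urllib3 level. A minimal sketch, assuming an arbitrary reachable HTTP endpoint (http://example.com/ is just a placeholder):

    import urllib3

    http = urllib3.PoolManager()

    # preload_content=False streams the body instead of reading it eagerly.
    resp = http.request("GET", "http://example.com/", preload_content=False)
    try:
        chunk = resp.read(1024)        # read as much (or as little) as needed
        print(resp.status, len(chunk))
    finally:
        # Hand the connection back to the pool once done with the body.
        resp.release_conn()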
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
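The timeout parameter documented above accepts a single float or a (connect, read) pair, which send() converts into a urllib3 Timeout; anything else trips the ValueError in the adapter code shown in this log. A small sketch of the three cases (placeholder URL, arbitrary values):

    import requests

    url = "http://localhost:12341/a%40b/api/contents"   # placeholder from the log

    # Single float: one budget for both connecting and reading.
    # requests.get(url, timeout=5)

    # (connect, read) pair: fail fast on connect, allow a slower response body.
    # requests.get(url, timeout=(3.05, 27))

    # Wrong shape: rejected before any network traffic happens.
    try:
        requests.get(url, timeout=(1, 2, 3))
    except ValueError as err:
        print(err)    # Invalid timeout (1, 2, 3). Pass a (connect, read) ...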
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ___________ ERROR at setup of SessionAPITest.test_modify_kernel_name ___________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
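The harness above starts the notebook server on a daemon thread, signals readiness with an Event, and then polls an HTTP endpoint; the errors in this log are that poll discovering the server thread already died. A stripped-down standard-library analogue of the pattern (the notebook-specific pieces are deliberately left out):

    import http.server
    import threading
    import time
    import urllib.request

    started = threading.Event()
    server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
    port = server.server_address[1]

    def run():
        started.set()                 # analogous to loop.add_callback(started.set)
        server.serve_forever()

    thread = threading.Thread(target=run, daemon=True)
    thread.start()
    started.wait()

    # Poll until the server answers, like wait_until_alive(); give up early if
    # the server thread has died, mirroring the RuntimeError in the traceback.
    deadline = time.monotonic() + 30
    while True:
        try:
            with urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=1):
                break
        except OSError as exc:
            if not thread.is_alive():
                raise RuntimeError("server thread died before it was reachable") from exc
            if time.monotonic() > deadline:
                raise
            time.sleep(0.1)

    server.shutdown()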
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of SessionAPITest.test_modify_path _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s _________ ERROR at setup of SessionAPITest.test_modify_path_deprecated _________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of SessionAPITest.test_modify_type _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s cls.tmp_dir = TemporaryDirectory() 211s def tmp(*parts): 211s path = os.path.join(cls.tmp_dir.name, *parts) 211s try: 211s os.makedirs(path) 211s except OSError as e: 211s if e.errno != errno.EEXIST: 211s raise 211s return path 211s 211s cls.home_dir = tmp('home') 211s data_dir = cls.data_dir = tmp('data') 211s config_dir = cls.config_dir = tmp('config') 211s runtime_dir = cls.runtime_dir = tmp('runtime') 211s cls.notebook_dir = tmp('notebooks') 211s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 211s cls.env_patch.start() 211s # Patch systemwide & user-wide data & config directories, to isolate 211s # the tests from oddities of the local setup. But leave Python env 211s # locations alone, so data files for e.g. nbconvert are accessible. 211s # If this isolation isn't sufficient, you may need to run the tests in 211s # a virtualenv or conda env. 
211s cls.path_patch = patch.multiple( 211s jupyter_core.paths, 211s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 211s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 211s ) 211s cls.path_patch.start() 211s 211s config = cls.config or Config() 211s config.NotebookNotary.db_file = ':memory:' 211s 211s cls.token = hexlify(os.urandom(4)).decode('ascii') 211s 211s started = Event() 211s def start_thread(): 211s try: 211s bind_args = cls.get_bind_args() 211s app = cls.notebook = NotebookApp( 211s port_retries=0, 211s open_browser=False, 211s config_dir=cls.config_dir, 211s data_dir=cls.data_dir, 211s runtime_dir=cls.runtime_dir, 211s notebook_dir=cls.notebook_dir, 211s base_url=cls.url_prefix, 211s config=config, 211s allow_root=True, 211s token=cls.token, 211s **bind_args 211s ) 211s if "asyncio" in sys.modules: 211s app._init_asyncio_patch() 211s import asyncio 211s 211s asyncio.set_event_loop(asyncio.new_event_loop()) 211s # Patch the current loop in order to match production 211s # behavior 211s import nest_asyncio 211s 211s nest_asyncio.apply() 211s # don't register signal handler during tests 211s app.init_signal = lambda : None 211s # clear log handlers and propagate to root for nose to capture it 211s # needs to be redone after initialize, which reconfigures logging 211s app.log.propagate = True 211s app.log.handlers = [] 211s app.initialize(argv=cls.get_argv()) 211s app.log.propagate = True 211s app.log.handlers = [] 211s loop = IOLoop.current() 211s loop.add_callback(started.set) 211s app.start() 211s finally: 211s # set the event, so failure to start doesn't cause a hang 211s started.set() 211s app.session_manager.close() 211s cls.notebook_thread = Thread(target=start_thread) 211s cls.notebook_thread.daemon = True 211s cls.notebook_thread.start() 211s started.wait() 211s > cls.wait_until_alive() 211s 211s notebook/tests/launchnotebook.py:198: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______________ ERROR at setup of AsyncSessionAPITest.test_create _______________ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 
211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 211s family = allowed_gai_family() 211s 211s try: 211s host.encode("idna") 211s except UnicodeError: 211s raise LocationParseError(f"'{host}', label empty or too long") from None 211s 211s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 211s af, socktype, proto, canonname, sa = res 211s sock = None 211s try: 211s sock = socket.socket(af, socktype, proto) 211s 211s # If provided, set socket level options before connecting. 
211s _set_socket_options(sock, socket_options) 211s 211s if timeout is not _DEFAULT_TIMEOUT: 211s sock.settimeout(timeout) 211s if source_address: 211s sock.bind(source_address) 211s > sock.connect(sa) 211s E ConnectionRefusedError: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s method = 'GET', url = '/a%40b/api/contents', body = None 211s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 211s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s redirect = False, assert_same_host = False 211s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 211s release_conn = False, chunked = False, body_pos = None, preload_content = False 211s decode_content = False, response_kw = {} 211s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 211s destination_scheme = None, conn = None, release_this_conn = True 211s http_tunnel_required = False, err = None, clean_exit = False 211s 211s def urlopen( # type: ignore[override] 211s self, 211s method: str, 211s url: str, 211s body: _TYPE_BODY | None = None, 211s headers: typing.Mapping[str, str] | None = None, 211s retries: Retry | bool | int | None = None, 211s redirect: bool = True, 211s assert_same_host: bool = True, 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s pool_timeout: int | None = None, 211s release_conn: bool | None = None, 211s chunked: bool = False, 211s body_pos: _TYPE_BODY_POSITION | None = None, 211s preload_content: bool = True, 211s decode_content: bool = True, 211s **response_kw: typing.Any, 211s ) -> BaseHTTPResponse: 211s """ 211s Get a connection from the pool and perform an HTTP request. This is the 211s lowest level call for making a request, so you'll need to specify all 211s the raw details. 211s 211s .. note:: 211s 211s More commonly, it's appropriate to use a convenience method 211s such as :meth:`request`. 211s 211s .. note:: 211s 211s `release_conn` will only behave as expected if 211s `preload_content=False` because we want to make 211s `preload_content=False` the default behaviour someday soon without 211s breaking backwards compatibility. 211s 211s :param method: 211s HTTP request method (such as GET, POST, PUT, etc.) 211s 211s :param url: 211s The URL to perform the request on. 211s 211s :param body: 211s Data to send in the request body, either :class:`str`, :class:`bytes`, 211s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 211s 211s :param headers: 211s Dictionary of custom headers to send, such as User-Agent, 211s If-None-Match, etc. If None, pool headers are used. If provided, 211s these headers completely replace any pool-specific headers. 211s 211s :param retries: 211s Configure the number of retries to allow before raising a 211s :class:`~urllib3.exceptions.MaxRetryError` exception. 211s 211s Pass ``None`` to retry until you receive a response. Pass a 211s :class:`~urllib3.util.retry.Retry` object for fine-grained control 211s over different types of retries. 211s Pass an integer number to retry connection errors that many times, 211s but no other types of errors. Pass zero to never retry. 211s 211s If ``False``, then retries are disabled and any exception is raised 211s immediately. 
Also, instead of raising a MaxRetryError on redirects, 211s the redirect response will be returned. 211s 211s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 211s 211s :param redirect: 211s If True, automatically handle redirects (status codes 301, 302, 211s 303, 307, 308). Each redirect counts as a retry. Disabling retries 211s will disable redirect, too. 211s 211s :param assert_same_host: 211s If ``True``, will make sure that the host of the pool requests is 211s consistent else will raise HostChangedError. When ``False``, you can 211s use the pool on an HTTP proxy and request foreign hosts. 211s 211s :param timeout: 211s If specified, overrides the default timeout for this one 211s request. It may be a float (in seconds) or an instance of 211s :class:`urllib3.util.Timeout`. 211s 211s :param pool_timeout: 211s If set and the pool is set to block=True, then this method will 211s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 211s connection is available within the time period. 211s 211s :param bool preload_content: 211s If True, the response's body will be preloaded into memory. 211s 211s :param bool decode_content: 211s If True, will attempt to decode the body based on the 211s 'content-encoding' header. 211s 211s :param release_conn: 211s If False, then the urlopen call will not release the connection 211s back into the pool once a response is received (but will release if 211s you read the entire contents of the response such as when 211s `preload_content=True`). This is useful if you're not preloading 211s the response's content immediately. You will need to call 211s ``r.release_conn()`` on the response ``r`` to return the connection 211s back into the pool. If None, it takes the value of ``preload_content`` 211s which defaults to ``True``. 211s 211s :param bool chunked: 211s If True, urllib3 will send the body using chunked transfer 211s encoding. Otherwise, urllib3 will send the body using the standard 211s content-length form. Defaults to False. 211s 211s :param int body_pos: 211s Position to seek to in file-like body in the event of a retry or 211s redirect. Typically this won't need to be set because urllib3 will 211s auto-populate the value when needed. 211s """ 211s parsed_url = parse_url(url) 211s destination_scheme = parsed_url.scheme 211s 211s if headers is None: 211s headers = self.headers 211s 211s if not isinstance(retries, Retry): 211s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 211s 211s if release_conn is None: 211s release_conn = preload_content 211s 211s # Check host 211s if assert_same_host and not self.is_same_host(url): 211s raise HostChangedError(self, url, retries) 211s 211s # Ensure that the URL we're connecting to is properly encoded 211s if url.startswith("/"): 211s url = to_str(_encode_target(url)) 211s else: 211s url = to_str(parsed_url.url) 211s 211s conn = None 211s 211s # Track whether `conn` needs to be released before 211s # returning/raising/recursing. Update this variable if necessary, and 211s # leave `release_conn` constant throughout the function. That way, if 211s # the function recurses, the original value of `release_conn` will be 211s # passed down into the recursive call, and its value will be respected. 211s # 211s # See issue #651 [1] for details. 211s # 211s # [1] 211s release_this_conn = release_conn 211s 211s http_tunnel_required = connection_requires_http_tunnel( 211s self.proxy, self.proxy_config, destination_scheme 211s ) 211s 211s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 211s # have to copy the headers dict so we can safely change it without those 211s # changes being reflected in anyone else's copy. 211s if not http_tunnel_required: 211s headers = headers.copy() # type: ignore[attr-defined] 211s headers.update(self.proxy_headers) # type: ignore[union-attr] 211s 211s # Must keep the exception bound to a separate variable or else Python 3 211s # complains about UnboundLocalError. 211s err = None 211s 211s # Keep track of whether we cleanly exited the except block. This 211s # ensures we do proper cleanup in finally. 211s clean_exit = False 211s 211s # Rewind body position, if needed. Record current position 211s # for future rewinds in the event of a redirect/retry. 211s body_pos = set_file_position(body, body_pos) 211s 211s try: 211s # Request a connection from the queue. 211s timeout_obj = self._get_timeout(timeout) 211s conn = self._get_conn(timeout=pool_timeout) 211s 211s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 211s 211s # Is this a closed/new connection that requires CONNECT tunnelling? 211s if self.proxy is not None and http_tunnel_required and conn.is_closed: 211s try: 211s self._prepare_proxy(conn) 211s except (BaseSSLError, OSError, SocketTimeout) as e: 211s self._raise_timeout( 211s err=e, url=self.proxy.url, timeout_value=conn.timeout 211s ) 211s raise 211s 211s # If we're going to release the connection in ``finally:``, then 211s # the response doesn't need to know about the connection. Otherwise 211s # it will also try to release it and we'll have a double-release 211s # mess. 211s response_conn = conn if not release_conn else None 211s 211s # Make the request on the HTTPConnection object 211s > response = self._make_request( 211s conn, 211s method, 211s url, 211s timeout=timeout_obj, 211s body=body, 211s headers=headers, 211s chunked=chunked, 211s retries=retries, 211s response_conn=response_conn, 211s preload_content=preload_content, 211s decode_content=decode_content, 211s **response_kw, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 211s conn.request( 211s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 211s self.endheaders() 211s /usr/lib/python3.12/http/client.py:1331: in endheaders 211s self._send_output(message_body, encode_chunked=encode_chunked) 211s /usr/lib/python3.12/http/client.py:1091: in _send_output 211s self.send(msg) 211s /usr/lib/python3.12/http/client.py:1035: in send 211s self.connect() 211s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 211s self.sock = self._new_conn() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s except socket.gaierror as e: 211s raise NameResolutionError(self.host, self, e) from e 211s except SocketTimeout as e: 211s raise ConnectTimeoutError( 211s self, 211s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 211s ) from e 211s 211s except OSError as e: 211s > raise NewConnectionError( 211s self, f"Failed to establish a new connection: {e}" 211s ) from e 211s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 
211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s > resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:486: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 211s retries = retries.increment( 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 211s method = 'GET', url = '/a%40b/api/contents', response = None 211s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 211s _pool = 211s _stacktrace = 211s 211s def increment( 211s self, 211s method: str | None = None, 211s url: str | None = None, 211s response: BaseHTTPResponse | None = None, 211s error: Exception | None = None, 211s _pool: ConnectionPool | None = None, 211s _stacktrace: TracebackType | None = None, 211s ) -> Retry: 211s """Return a new Retry object with incremented retry counters. 211s 211s :param response: A response object, or None, if the server did not 211s return a response. 211s :type response: :class:`~urllib3.response.BaseHTTPResponse` 211s :param Exception error: An error encountered during the request, or 211s None if the response was received successfully. 211s 211s :return: A new ``Retry`` object. 211s """ 211s if self.total is False and error: 211s # Disabled, indicate to re-raise the error. 211s raise reraise(type(error), error, _stacktrace) 211s 211s total = self.total 211s if total is not None: 211s total -= 1 211s 211s connect = self.connect 211s read = self.read 211s redirect = self.redirect 211s status_count = self.status 211s other = self.other 211s cause = "unknown" 211s status = None 211s redirect_location = None 211s 211s if error and self._is_connection_error(error): 211s # Connect retry? 211s if connect is False: 211s raise reraise(type(error), error, _stacktrace) 211s elif connect is not None: 211s connect -= 1 211s 211s elif error and self._is_read_error(error): 211s # Read retry? 211s if read is False or method is None or not self._is_method_retryable(method): 211s raise reraise(type(error), error, _stacktrace) 211s elif read is not None: 211s read -= 1 211s 211s elif error: 211s # Other retry? 211s if other is not None: 211s other -= 1 211s 211s elif response and response.get_redirect_location(): 211s # Redirect retry? 
211s if redirect is not None: 211s redirect -= 1 211s cause = "too many redirects" 211s response_redirect_location = response.get_redirect_location() 211s if response_redirect_location: 211s redirect_location = response_redirect_location 211s status = response.status 211s 211s else: 211s # Incrementing because of a server error like a 500 in 211s # status_forcelist and the given method is in the allowed_methods 211s cause = ResponseError.GENERIC_ERROR 211s if response and response.status: 211s if status_count is not None: 211s status_count -= 1 211s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 211s status = response.status 211s 211s history = self.history + ( 211s RequestHistory(method, url, error, status, redirect_location), 211s ) 211s 211s new_retry = self.new( 211s total=total, 211s connect=connect, 211s read=read, 211s redirect=redirect, 211s status=status_count, 211s other=other, 211s history=history, 211s ) 211s 211s if new_retry.is_exhausted(): 211s reason = error or ResponseError(cause) 211s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 211s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 211s 211s During handling of the above exception, another exception occurred: 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s > cls.fetch_url(url) 211s 211s notebook/tests/launchnotebook.py:53: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:82: in fetch_url 211s return requests.get(url) 211s /usr/lib/python3/dist-packages/requests/api.py:73: in get 211s return request("get", url, params=params, **kwargs) 211s /usr/lib/python3/dist-packages/requests/api.py:59: in request 211s return session.request(method=method, url=url, **kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 211s resp = self.send(prep, **send_kwargs) 211s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 211s r = adapter.send(request, **kwargs) 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s self = 211s request = , stream = False 211s timeout = Timeout(connect=None, read=None, total=None), verify = True 211s cert = None, proxies = OrderedDict() 211s 211s def send( 211s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 211s ): 211s """Sends PreparedRequest object. Returns Response object. 211s 211s :param request: The :class:`PreparedRequest ` being sent. 211s :param stream: (optional) Whether to stream the request content. 211s :param timeout: (optional) How long to wait for the server to send 211s data before giving up, as a float, or a :ref:`(connect timeout, 211s read timeout) ` tuple. 211s :type timeout: float or tuple or urllib3 Timeout object 211s :param verify: (optional) Either a boolean, in which case it controls whether 211s we verify the server's TLS certificate, or a string, in which case it 211s must be a path to a CA bundle to use 211s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
211s :param proxies: (optional) The proxies dictionary to apply to the request. 211s :rtype: requests.Response 211s """ 211s 211s try: 211s conn = self.get_connection(request.url, proxies) 211s except LocationValueError as e: 211s raise InvalidURL(e, request=request) 211s 211s self.cert_verify(conn, request.url, verify, cert) 211s url = self.request_url(request, proxies) 211s self.add_headers( 211s request, 211s stream=stream, 211s timeout=timeout, 211s verify=verify, 211s cert=cert, 211s proxies=proxies, 211s ) 211s 211s chunked = not (request.body is None or "Content-Length" in request.headers) 211s 211s if isinstance(timeout, tuple): 211s try: 211s connect, read = timeout 211s timeout = TimeoutSauce(connect=connect, read=read) 211s except ValueError: 211s raise ValueError( 211s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 211s f"or a single float to set both timeouts to the same value." 211s ) 211s elif isinstance(timeout, TimeoutSauce): 211s pass 211s else: 211s timeout = TimeoutSauce(connect=timeout, read=timeout) 211s 211s try: 211s resp = conn.urlopen( 211s method=request.method, 211s url=url, 211s body=request.body, 211s headers=request.headers, 211s redirect=False, 211s assert_same_host=False, 211s preload_content=False, 211s decode_content=False, 211s retries=self.max_retries, 211s timeout=timeout, 211s chunked=chunked, 211s ) 211s 211s except (ProtocolError, OSError) as err: 211s raise ConnectionError(err, request=request) 211s 211s except MaxRetryError as e: 211s if isinstance(e.reason, ConnectTimeoutError): 211s # TODO: Remove this in 3.0.0: see #2811 211s if not isinstance(e.reason, NewConnectionError): 211s raise ConnectTimeout(e, request=request) 211s 211s if isinstance(e.reason, ResponseError): 211s raise RetryError(e, request=request) 211s 211s if isinstance(e.reason, _ProxyError): 211s raise ProxyError(e, request=request) 211s 211s if isinstance(e.reason, _SSLError): 211s # This branch is for urllib3 v1.22 and later. 211s raise SSLError(e, request=request) 211s 211s > raise ConnectionError(e, request=request) 211s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 211s 211s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 211s 211s The above exception was the direct cause of the following exception: 211s 211s cls = 211s 211s @classmethod 211s def setup_class(cls): 211s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
211s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 211s > super().setup_class() 211s 211s notebook/services/sessions/tests/test_sessions_api.py:274: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s notebook/tests/launchnotebook.py:198: in setup_class 211s cls.wait_until_alive() 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s cls = 211s 211s @classmethod 211s def wait_until_alive(cls): 211s """Wait for the server to be alive""" 211s url = cls.base_url() + 'api/contents' 211s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 211s try: 211s cls.fetch_url(url) 211s except ModuleNotFoundError as error: 211s # Errors that should be immediately thrown back to caller 211s raise error 211s except Exception as e: 211s if not cls.notebook_thread.is_alive(): 211s > raise RuntimeError("The notebook server failed to start") from e 211s E RuntimeError: The notebook server failed to start 211s 211s notebook/tests/launchnotebook.py:59: RuntimeError 211s ______ ERROR at setup of AsyncSessionAPITest.test_create_console_session _______ 211s 211s self = 211s 211s def _new_conn(self) -> socket.socket: 211s """Establish a socket connection and set nodelay settings on it. 211s 211s :return: New socket connection. 211s """ 211s try: 211s > sock = connection.create_connection( 211s (self._dns_host, self.port), 211s self.timeout, 211s source_address=self.source_address, 211s socket_options=self.socket_options, 211s ) 211s 211s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 211s raise err 211s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 211s 211s address = ('localhost', 12341), timeout = None, source_address = None 211s socket_options = [(6, 1, 1)] 211s 211s def create_connection( 211s address: tuple[str, int], 211s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 211s source_address: tuple[str, int] | None = None, 211s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 211s ) -> socket.socket: 211s """Connect to *address* and return the socket object. 211s 211s Convenience function. Connect to *address* (a 2-tuple ``(host, 211s port)``) and return the socket object. Passing the optional 211s *timeout* parameter will set the timeout on the socket instance 211s before attempting to connect. If no *timeout* is supplied, the 211s global default timeout setting returned by :func:`socket.getdefaulttimeout` 211s is used. If *source_address* is set it must be a tuple of (host, port) 211s for the socket to bind as a source address before making the connection. 211s An host of '' or port 0 tells the OS to use the default. 211s """ 211s 211s host, port = address 211s if host.startswith("["): 211s host = host.strip("[]") 211s err = None 211s 211s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 211s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 211s # The original create_connection function always returns all records. 
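Each of these setup errors ends in the harness's wait_until_alive() poll quoted above: the notebook server thread never comes up on localhost:12341, every GET to /a%40b/api/contents is refused, and the poll finally raises RuntimeError("The notebook server failed to start"). A minimal standalone sketch of that readiness check (illustrative only; the URL and the max_wait/poll_interval defaults are placeholders, not the harness's MAX_WAITTIME/POLL_INTERVAL configuration):

    import time
    import requests

    def wait_until_alive(url, max_wait=30.0, poll_interval=0.1):
        """Poll `url` until something answers; a simplified version of the loop quoted above."""
        deadline = time.monotonic() + max_wait
        while time.monotonic() < deadline:
            try:
                requests.get(url, timeout=poll_interval)
                return True                        # server answered (any HTTP status)
            except requests.exceptions.ConnectionError:
                time.sleep(poll_interval)          # nothing listening yet; try again
        return False

    # Returns False in this run, because nothing ever listens on port 12341:
    # wait_until_alive("http://localhost:12341/a%40b/api/contents")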
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of AsyncSessionAPITest.test_create_deprecated _________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
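Every traceback in this run bottoms out in the same OS-level failure: connect() to ('localhost', 12341) fails with ECONNREFUSED (Errno 111), which urllib3 wraps in NewConnectionError and requests surfaces as ConnectionError. A quick way to confirm that nothing is listening on the port, independent of the test harness (diagnostic sketch; host and port taken from the log above):

    import errno
    import socket

    def port_is_listening(host="localhost", port=12341, timeout=1.0):
        """True if a TCP connect succeeds, False on the ECONNREFUSED seen in this log."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except ConnectionRefusedError as exc:
            assert exc.errno == errno.ECONNREFUSED   # Errno 111
            return False

    # port_is_listening() -> False while the notebook server is down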
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ________ ERROR at setup of AsyncSessionAPITest.test_create_file_session ________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
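The harness method wait_until_alive shown in the traceback above polls the contents API until the server answers or its thread dies. A generic sketch of that polling pattern; MAX_WAITTIME, POLL_INTERVAL and the server_is_alive callback are illustrative assumptions, not the harness's actual configuration:

    # Polling sketch modelled on wait_until_alive above; the constants and the
    # server_is_alive callback are illustrative assumptions.
    import time
    import requests

    MAX_WAITTIME = 30     # seconds (assumed)
    POLL_INTERVAL = 0.1   # seconds (assumed)

    def wait_until_alive(url, server_is_alive):
        """Poll `url` until it answers, the server dies, or the wait budget runs out."""
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return                                   # server responded: it is alive
            except Exception as exc:
                if not server_is_alive():                # e.g. the notebook thread exited
                    raise RuntimeError("The notebook server failed to start") from exc
                time.sleep(POLL_INTERVAL)
        raise TimeoutError(f"{url} did not come up within {MAX_WAITTIME}s")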
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
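The retries documentation above distinguishes Retry objects, integers, False and None. A small sketch of those forms (values are illustrative only):

    # Sketch of the retry configurations described above (illustrative values).
    from urllib3.util.retry import Retry

    # The Retry object seen throughout this log: one attempt total, read errors never retried.
    r = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    print(r.is_exhausted())     # False until a counter has gone below zero

    # Integer shorthand: retry connection errors up to 3 times, nothing else special.
    print(Retry.from_int(3))

    # retries=False disables retries entirely; the original exception is re-raised.
    print(Retry.from_int(False))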
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _______ ERROR at setup of AsyncSessionAPITest.test_create_with_kernel_id _______ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
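Underneath all of the layers above, the failure is a plain TCP connect being refused, as in the create_connection excerpt. A bare-socket sketch of the same [Errno 111], again assuming nothing is listening on localhost:12341:

    # Bare-socket sketch of the ECONNREFUSED seen throughout this log.
    # Assumption: nothing is listening on localhost:12341.
    import errno
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.connect(("localhost", 12341))
    except ConnectionRefusedError as exc:   # subclass of OSError, errno 111 on Linux
        assert exc.errno == errno.ECONNREFUSED
        print("refused:", exc)
    finally:
        sock.close()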
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of AsyncSessionAPITest.test_delete _______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
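Every traceback in this run bottoms out in sock.connect() failing with [Errno 111] against localhost:12341, i.e. nothing is listening on the notebook server's port. A quick way to check that outside the test harness is a bare TCP probe; a sketch only, with the host/port taken from the log and the helper name chosen here.

import socket

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        # socket.create_connection performs the same getaddrinfo()+connect()
        # loop that urllib3's create_connection wraps in the frames above.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, DNS failures, ...
        return False

# Port number taken from the failing test run; purely illustrative here.
print(port_is_open("localhost", 12341))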
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
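The retries parameter documented above is where requests' HTTPAdapter feeds its max_retries policy; the traceback shows the default Retry(total=0, read=False), which is why a single refused connect already raises MaxRetryError. A sketch of calling urllib3 directly with a more forgiving Retry; the URL and retry counts are illustrative, not values used by the test suite.

import urllib3
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

http = urllib3.PoolManager()

# Retry connect errors a few times with a short back-off, instead of the
# Retry(total=0, read=False) policy shown in the traceback above.
retry = Retry(total=3, connect=3, backoff_factor=0.2)

try:
    # URL taken from the failing request; with nothing listening on the port
    # this still fails, but only after the connect retries are exhausted.
    resp = http.request("GET", "http://localhost:12341/a%40b/api/contents",
                        retries=retry)
    print(resp.status)
except MaxRetryError as exc:
    print("gave up after retries:", exc.reason)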
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of AsyncSessionAPITest.test_modify_kernel_id __________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
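On the requests side, the same kind of Retry object can be installed by mounting an HTTPAdapter with max_retries set; that is what reaches this urlopen() call as retries=self.max_retries. A sketch under those assumptions; the URL, timeout, and retry counts are placeholders.

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()

# Give connection errors a few attempts with back-off; the adapter hands this
# Retry object to urllib3 as retries=self.max_retries in send().
adapter = HTTPAdapter(max_retries=Retry(total=3, connect=3, backoff_factor=0.2))
session.mount("http://", adapter)
session.mount("https://", adapter)

try:
    # Illustrative URL matching the one in the traceback.
    session.get("http://localhost:12341/a%40b/api/contents", timeout=2)
except requests.exceptions.ConnectionError as exc:
    print("still refused after retries:", exc)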
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ________ ERROR at setup of AsyncSessionAPITest.test_modify_kernel_name _________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
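For reference, the harness code quoted above polls GET api/contents every POLL_INTERVAL seconds for at most MAX_WAITTIME, and bails out with the RuntimeError seen here as soon as the notebook thread has died. A condensed sketch of that pattern; the constants and the server_is_running callable are placeholders, not the harness's real values.

import time
import requests

POLL_INTERVAL = 0.1   # placeholder values; the real constants live in
MAX_WAITTIME = 30     # notebook/tests/launchnotebook.py

def wait_until_alive(url: str, server_is_running) -> None:
    """Poll url until it answers; give up after MAX_WAITTIME seconds."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except Exception as exc:
            # If the server thread has already died, fail immediately instead
            # of burning the whole timeout (the RuntimeError shown above).
            if not server_is_running():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"Server at {url} never came up")

# Example: wait_until_alive("http://localhost:12341/a%40b/api/contents",
#                           server_is_running=lambda: True)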
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
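As a concrete illustration of the retries parameter documented above, a Retry object is normally handed to urllib3's public PoolManager rather than to the pool's low-level urlopen; the host and path below are placeholders:

    import urllib3
    from urllib3.util.retry import Retry

    # Allow up to 3 connect retries, no read retries, and no redirect following.
    retry = Retry(total=3, connect=3, read=0, redirect=0)
    http = urllib3.PoolManager(retries=retry)

    # resp = http.request("GET", "http://localhost:8888/api/contents",
    #                     timeout=urllib3.Timeout(connect=2.0, read=5.0))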
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
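The chained tracebacks above show the same low-level ConnectionRefusedError being wrapped first in urllib3's NewConnectionError and MaxRetryError, then in requests' ConnectionError. A small sketch of inspecting that chain from the caller's side; the URL is just the endpoint seen in this log:

    import requests
    from urllib3.exceptions import MaxRetryError, NewConnectionError

    def probe(url):
        try:
            return requests.get(url, timeout=2)
        except requests.exceptions.ConnectionError as exc:
            # requests stores the underlying MaxRetryError as the first argument;
            # its .reason is the NewConnectionError from the refused TCP connect.
            cause = exc.args[0] if exc.args else None
            if isinstance(cause, MaxRetryError) and isinstance(cause.reason, NewConnectionError):
                print("connection refused or host unreachable:", cause.reason)
            return None

    # probe("http://localhost:12341/a%40b/api/contents")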
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
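Retry.increment above decrements whichever counter applies and raises MaxRetryError as soon as the budget is exhausted; with Retry(total=0, ...) as used by this test client, a single refused connection is enough. A tiny illustration, assuming nothing is listening on the port:

    import urllib3
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    pool = urllib3.HTTPConnectionPool("localhost", 12341,
                                      retries=Retry(total=0, read=False))
    try:
        pool.urlopen("GET", "/a%40b/api/contents")
    except MaxRetryError as e:
        print("gave up after the first failure:", e.reason)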
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ____________ ERROR at setup of AsyncSessionAPITest.test_modify_path ____________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
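create_connection above walks the getaddrinfo results and connects to the first address that accepts; the standard library's socket.create_connection does the same job directly. A minimal sketch using the host and port from this log:

    import socket

    try:
        sock = socket.create_connection(("localhost", 12341), timeout=2)
        sock.close()
    except ConnectionRefusedError as e:
        print("nothing listening on the port:", e)      # [Errno 111] in this log
    except OSError as e:
        print("connect failed:", e)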
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
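The urlopen preamble above defaults release_conn to preload_content. A short sketch of the streaming case the docstring describes, where preload_content=False means the caller reads the body incrementally and then returns the connection to the pool; the URL is a placeholder:

    import urllib3

    http = urllib3.PoolManager()
    # resp = http.request("GET", "http://localhost:8888/api/contents",
    #                     preload_content=False)      # do not read the body eagerly
    # for chunk in resp.stream(1024):
    #     pass                                        # consume the body incrementally
    # resp.release_conn()                             # hand the socket back to the pool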
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
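adapter.send above opts into chunked transfer encoding whenever a body is present but no Content-Length header is set. A short sketch of the two cases from the caller's perspective; a generator body is what forces chunking, since its length is unknown up front (the upload URL is a placeholder):

    import requests

    def body_chunks():
        yield b"part-1"
        yield b"part-2"

    # With a generator body requests cannot set Content-Length, so the adapter
    # falls back to chunked transfer encoding.
    # requests.post("http://localhost:8888/upload", data=body_chunks())

    # With bytes, Content-Length is set and the body is sent in one piece.
    # requests.post("http://localhost:8888/upload", data=b"part-1part-2")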
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
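The adapter code above turns a single float into both the connect and read timeout and unpacks a (connect, read) tuple into an urllib3 Timeout. The corresponding caller-side options look like this; the URL is the endpoint from this log and will simply be refused if nothing is listening:

    import requests

    url = "http://localhost:12341/a%40b/api/contents"
    try:
        requests.get(url, timeout=3.0)          # one float: connect and read share it
        requests.get(url, timeout=(1.0, 10.0))  # tuple: separate connect/read timeouts
    except (requests.ConnectTimeout, requests.ReadTimeout, requests.ConnectionError) as exc:
        print("request failed:", exc)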
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______ ERROR at setup of AsyncSessionAPITest.test_modify_path_deprecated _______ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
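[Editor's note] The adapter send() docstring above ends by noting that timeout may be a single float or a (connect timeout, read timeout) tuple, which the quoted code then wraps in a TimeoutSauce. A hedged usage sketch of those two forms (example.org is a placeholder):

# Hedged usage sketch of the timeout forms described above.
import requests

requests.get("https://example.org/", timeout=5)           # 5 s for both connect and read
requests.get("https://example.org/", timeout=(3.05, 27))  # 3.05 s connect, 27 s read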
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
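[Editor's note] Earlier in this chunk, adapters.py maps the urllib3 MaxRetryError onto requests.exceptions.ConnectionError, which is exactly what the notebook test harness keeps catching while it polls the not-yet-started server. A minimal sketch of what that polling code observes, assuming nothing listens on the port:

# Hedged sketch: the MaxRetryError above surfaces to requests callers as
# requests.exceptions.ConnectionError. Port 12341 mirrors this log and is
# assumed to be unused on the machine running the sketch.
import requests

try:
    requests.get("http://localhost:12341/a%40b/api/contents", timeout=1)
except requests.exceptions.ConnectionError as exc:
    print("server not up yet:", exc)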
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ____________ ERROR at setup of AsyncSessionAPITest.test_modify_type ____________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
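[Editor's note] The create_connection comment quoted just above explains that allowed_gai_family() steers getaddrinfo towards IPv4 records, IPv6 records, or both, after which each returned address is tried in turn. A standalone sketch of that resolve-then-try loop (not urllib3 code; names are illustrative):

# Hedged sketch of the resolve-and-try pattern described in the comment above.
# errno 111 here is the same "Connection refused" seen throughout this log.
import socket

def try_connect(host, port, family=socket.AF_UNSPEC):
    last_err = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
        sock = socket.socket(af, socktype, proto)
        try:
            sock.connect(sa)          # raises ConnectionRefusedError if nothing listens
            return sock
        except OSError as err:
            last_err = err
            sock.close()
    raise last_err if last_err else OSError(f"could not connect to {host}:{port}")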
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
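[Editor's note] The release_conn note in the docstring above only matters with preload_content=False, where the caller streams the body and must hand the connection back to the pool itself. A hedged sketch of that pairing (placeholder URL):

# Hedged sketch of the preload_content=False / release_conn() pairing described
# in the docstring above; https://example.org/ is a placeholder.
from urllib3 import PoolManager

http = PoolManager()
resp = http.request("GET", "https://example.org/", preload_content=False)
try:
    for chunk in resp.stream(1024):   # read the body incrementally
        pass
finally:
    resp.release_conn()               # return the socket to the pool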
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s if not async_testing_enabled: # Can be removed once jupyter_client >= 6.1 is required. 
212s raise SkipTest("AsyncSessionAPITest tests skipped due to down-level jupyter_client!") 212s > super().setup_class() 212s 212s notebook/services/sessions/tests/test_sessions_api.py:274: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ____________ ERROR at setup of TerminalAPITest.test_create_terminal ____________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 
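[Editor's note] The wait_until_alive classmethod quoted throughout these setup errors polls base_url + 'api/contents' until the server answers, and turns the repeated ConnectionError into RuntimeError once the server thread has died. A hedged re-sketch of that loop (MAX_WAITTIME and POLL_INTERVAL values are assumptions; launchnotebook.py defines its own):

# Hedged sketch of the polling loop in notebook/tests/launchnotebook.py quoted
# above. The constants are assumed values, not the module's real ones.
import time
import requests

MAX_WAITTIME = 30     # seconds (assumed)
POLL_INTERVAL = 0.1   # seconds (assumed)

def wait_until_alive(base_url, thread_is_alive=lambda: True):
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return
        except requests.ConnectionError as exc:
            if not thread_is_alive():
                raise RuntimeError("The notebook server failed to start") from exc
            time.sleep(POLL_INTERVAL)
    raise RuntimeError("The notebook server never became alive")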
212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 
212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 
'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ________ ERROR at setup of TerminalAPITest.test_create_terminal_via_get ________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
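The frames above show the full translation chain for a refused connection: socket.connect raises ConnectionRefusedError, urllib3 wraps it in NewConnectionError, the adapter's Retry(total=0) is exhausted on the first increment and raises MaxRetryError, and requests' HTTPAdapter.send re-raises that as requests.exceptions.ConnectionError. The following is a minimal sketch, not part of the test suite, that reproduces the same chain under the assumption (as in this run) that nothing is listening on localhost:12341 and that requests 2.31 with urllib3 2.x is installed:

    import requests
    from urllib3.exceptions import MaxRetryError

    try:
        # requests' default adapter uses zero retries (Retry(total=0) in the
        # traceback above), so the very first refused connection is fatal.
        requests.get("http://localhost:12341/a%40b/api/contents")
    except requests.exceptions.ConnectionError as exc:
        inner = exc.args[0]  # the wrapped urllib3.exceptions.MaxRetryError
        print(type(inner).__name__, "->", type(inner.reason).__name__)
        # Expected under the stated assumption:
        # MaxRetryError -> NewConnectionError ([Errno 111] Connection refused)

requests keeps the original MaxRetryError as args[0] of the ConnectionError, which is why the final error messages in this log still quote "Max retries exceeded" and the underlying NewConnectionError.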
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _______ ERROR at setup of TerminalAPITest.test_create_terminal_with_name _______ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _____________ ERROR at setup of TerminalAPITest.test_no_terminals ______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
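The wait_until_alive() helper quoted above polls the contents API and only gives up once the notebook server thread has died, which is how the RuntimeError at the end of each of these tracebacks is produced. A minimal sketch of that polling pattern follows; the constants and the thread_is_alive callable are illustrative stand-ins, not the harness's actual definitions.

import time
import requests

MAX_WAITTIME = 30       # illustrative stand-ins; the harness defines its own
POLL_INTERVAL = 0.1     # values for these names

def wait_for_server(url, thread_is_alive):
    """Poll `url` until the server answers, mirroring wait_until_alive()."""
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            return                      # any HTTP response means the server is up
        except requests.ConnectionError as e:
            if not thread_is_alive():
                # The server thread died before it ever started listening.
                raise RuntimeError("The notebook server failed to start") from e
            time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"no response from {url}")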
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
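The retries parameter documented here is what turns a single refused connection into MaxRetryError in these tracebacks: the adapter passes Retry(total=0, connect=None, read=False, ...), so the budget is exhausted on the first error. For comparison, a sketch of mounting a more tolerant policy on a requests session (the values are arbitrary examples, not something this test suite does):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
policy = Retry(
    total=5,             # overall attempt budget
    connect=5,           # how many connection errors to tolerate
    backoff_factor=0.2,  # exponential back-off between attempts
)
adapter = HTTPAdapter(max_retries=policy)
session.mount("http://", adapter)
session.mount("https://", adapter)
# session.get(...) now retries refused connections instead of failing at once.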
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
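The comment above explains why setup_class patches the Jupyter path machinery: the server under test must not pick up configuration or data from the machine running the tests. A rough pytest-flavoured sketch of the user-level part of that isolation (jupyter_core also honours these environment variables; the system-wide paths are patched separately in the code that follows):

import pytest

@pytest.fixture
def isolated_jupyter_dirs(tmp_path, monkeypatch):
    # Send every user-level Jupyter lookup into a throwaway directory tree
    # so the server under test cannot read or write the real local setup.
    monkeypatch.setenv("HOME", str(tmp_path / "home"))
    monkeypatch.setenv("JUPYTER_CONFIG_DIR", str(tmp_path / "config"))
    monkeypatch.setenv("JUPYTER_DATA_DIR", str(tmp_path / "data"))
    monkeypatch.setenv("JUPYTER_RUNTIME_DIR", str(tmp_path / "runtime"))
    (tmp_path / "home").mkdir()
    return tmp_path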
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ___________ ERROR at setup of TerminalAPITest.test_terminal_handler ____________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
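Stepping back from the retry machinery, the root error in these tracebacks is the ConnectionRefusedError raised a few frames up in the socket layer: nothing is listening on the port the client dials. The same failure can be reproduced directly with the standard library (the port number is just the one that appears in this log):

import socket

try:
    # Attempt a plain TCP connection to a port with no listener.
    socket.create_connection(("localhost", 12341), timeout=5)
except ConnectionRefusedError as e:
    print(e)   # [Errno 111] Connection refused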
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of TerminalAPITest.test_terminal_root_handler _________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
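Every error in this group follows the same shape: setup_class() starts a NotebookApp in a daemon thread, wait_until_alive() polls GET /a%40b/api/contents, and because the server thread dies before binding port 12341 the poll gives up with RuntimeError("The notebook server failed to start"). A hedged sketch of that polling pattern (illustrative names and default values, not the launchnotebook.py source):

    import time
    import requests

    def poll_until_alive(url, server_thread, max_waittime=30.0, poll_interval=0.1):
        """Poll `url` until it answers, or fail fast if the server thread has died."""
        for _ in range(int(max_waittime / poll_interval)):
            try:
                return requests.get(url)
            except Exception as exc:
                # Mirrors the check in the traceback: a dead server thread
                # will never start answering, so re-raise immediately.
                if not server_thread.is_alive():
                    raise RuntimeError("The notebook server failed to start") from exc
                time.sleep(poll_interval)
        raise RuntimeError("Timed out waiting for the server to come up")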
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of TerminalCullingTest.test_config _______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
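The "direct cause" lines that structure each of these reports come from exception chaining: launchnotebook.py re-raises with `raise RuntimeError(...) from e`, which sets __cause__ and makes the traceback print the whole ConnectionRefusedError -> NewConnectionError -> MaxRetryError -> ConnectionError -> RuntimeError chain. A minimal, self-contained illustration (not notebook code):

    def start_server():
        # Stand-in for the failing server start; always refuses.
        raise ConnectionRefusedError(111, "Connection refused")

    try:
        try:
            start_server()
        except OSError as exc:
            raise RuntimeError("The notebook server failed to start") from exc
    except RuntimeError as err:
        print(type(err.__cause__).__name__)   # ConnectionRefusedError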
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of TerminalCullingTest.test_culling ______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of FilesTest.test_contents_manager _______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s __________________ ERROR at setup of FilesTest.test_download ___________________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
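The root failure in the traceback above is a plain TCP refusal: nothing is listening on localhost:12341 because the server thread never finished starting. A minimal sketch reproducing that errno 111, assuming the port really is unused:

import errno
import socket

# Connecting to a port with no listener fails with ECONNREFUSED (errno 111
# on Linux); urllib3's _new_conn() wraps this in NewConnectionError.
try:
    socket.create_connection(("localhost", 12341), timeout=1)
except ConnectionRefusedError as exc:
    assert exc.errno == errno.ECONNREFUSED
    print(f"refused as expected: {exc}")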
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
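Retry.increment() above is what turns the refused connection into MaxRetryError: the adapter passes Retry(total=0), so the retry budget is exhausted by the very first connection error and increment() raises instead of retrying. A minimal sketch of that behaviour against the same unused port (an illustration, not the test code):

import urllib3
from urllib3.util.retry import Retry

# total=0 allows no retries: the first refused connection exhausts the
# budget and is re-raised as MaxRetryError, exactly as in the log above.
pool = urllib3.HTTPConnectionPool("localhost", 12341, retries=Retry(total=0))
try:
    pool.request("GET", "/a%40b/api/contents")
except urllib3.exceptions.MaxRetryError as exc:
    print(f"gave up: {exc.reason!r}")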
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ________________ ERROR at setup of FilesTest.test_hidden_files _________________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
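The socket_options value [(6, 1, 1)] in the locals above is urllib3's default connection option list: level 6 is IPPROTO_TCP and option 1 is TCP_NODELAY, so Nagle's algorithm is disabled on every new connection. A minimal sketch of what _set_socket_options() does with it:

import socket

# [(6, 1, 1)] == [(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]; each tuple
# is passed straight to setsockopt() before the connect() call.
assert (socket.IPPROTO_TCP, socket.TCP_NODELAY) == (6, 1)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
for level, optname, value in [(6, 1, 1)]:
    sock.setsockopt(level, optname, value)
print(sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY))  # non-zero once set
sock.close()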
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
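wait_until_alive() above polls the contents API until the server answers, bailing out early once the server thread has died. A minimal sketch of the same polling pattern with requests, assuming the base URL seen in this log (an illustration, not launchnotebook.py):

import time
import requests

# Poll until /a%40b/api/contents answers or the deadline passes; a refused
# connection surfaces as requests.exceptions.ConnectionError.
url = "http://localhost:12341/a%40b/api/contents"
deadline = time.monotonic() + 30
while True:
    try:
        requests.get(url, timeout=5)
        break  # server is up
    except requests.exceptions.ConnectionError:
        if time.monotonic() > deadline:
            raise RuntimeError("The notebook server failed to start")
        time.sleep(0.5)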
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
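The adapter code above accepts either a single number or a (connect, read) pair for timeout and normalises both into a urllib3 Timeout. A minimal sketch of the two call forms (standard requests usage, not specific to this suite):

import requests

# A single float sets both the connect and the read timeout; a tuple sets
# them separately. Against this log's unused port, either call fails
# immediately with ConnectionError instead of waiting out the timeout.
for timeout in (5, (3.05, 27)):
    try:
        requests.get("http://localhost:12341/a%40b/api/contents", timeout=timeout)
    except requests.exceptions.RequestException as exc:
        print(f"timeout={timeout!r}: {type(exc).__name__}")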
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _____________ ERROR at setup of FilesTest.test_old_files_redirect ______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
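Editor's note: the setup_class() code above isolates the tests from the host machine by patching the process environment (patch.dict on os.environ) and the jupyter_core.paths constants (patch.multiple) before the server starts, as the inline comment explains. The following is a minimal sketch of that isolation technique with unittest.mock under stated assumptions: fake_paths and EXAMPLE_CONFIG_DIR are throwaway stand-ins introduced for illustration, not the real Jupyter path lists or environment variables.

    import os
    import tempfile
    import types
    from unittest.mock import patch

    # A stand-in for a module whose constants the tests want to redirect
    # (jupyter_core.paths plays this role in the traceback above).
    fake_paths = types.SimpleNamespace(SYSTEM_CONFIG_PATH=["/etc/example"])

    with tempfile.TemporaryDirectory() as tmp:
        env_patch = patch.dict(os.environ, {"EXAMPLE_CONFIG_DIR": tmp})
        path_patch = patch.multiple(fake_paths,
                                    SYSTEM_CONFIG_PATH=[os.path.join(tmp, "etc")])
        env_patch.start()
        path_patch.start()
        try:
            # While the patches are active, both lookups see the temporary values.
            assert os.environ["EXAMPLE_CONFIG_DIR"] == tmp
            assert fake_paths.SYSTEM_CONFIG_PATH == [os.path.join(tmp, "etc")]
        finally:
            path_patch.stop()
            env_patch.stop()
    # After stop(), the environment variable and the attribute are restored.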
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s __________________ ERROR at setup of FilesTest.test_view_html __________________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
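Editor's note: the start_thread() body above also has to create and install a fresh asyncio event loop inside the server thread, because threads other than the main thread have no loop installed by default; nest_asyncio.apply() is then used so the already-running loop can be re-entered the way it is in production. A small sketch of the per-thread loop part alone, assuming nothing beyond the standard library (nest_asyncio is third-party and not needed for this bare example; asyncio.sleep stands in for app.start()):

    import asyncio
    import threading

    def worker():
        # Worker threads do not inherit an event loop; create and install one.
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        try:
            loop.run_until_complete(asyncio.sleep(0.1))   # stand-in for app.start()
        finally:
            loop.close()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    t.join()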
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s __________ ERROR at setup of TestGateway.test_gateway_class_mappings ___________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request)
212s
212s > raise ConnectionError(e, request=request)
212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
212s
212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError
212s
212s The above exception was the direct cause of the following exception:
212s
212s cls =
212s
212s @classmethod
212s def setup_class(cls):
212s GatewayClient.clear_instance()
212s > super().setup_class()
212s
212s notebook/tests/test_gateway.py:138:
212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
212s notebook/tests/launchnotebook.py:198: in setup_class
212s cls.wait_until_alive()
212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
212s
212s cls =
212s
212s @classmethod
212s def wait_until_alive(cls):
212s """Wait for the server to be alive"""
212s url = cls.base_url() + 'api/contents'
212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)):
212s try:
212s cls.fetch_url(url)
212s except ModuleNotFoundError as error:
212s # Errors that should be immediately thrown back to caller
212s raise error
212s except Exception as e:
212s if not cls.notebook_thread.is_alive():
212s > raise RuntimeError("The notebook server failed to start") from e
212s E RuntimeError: The notebook server failed to start
212s
212s notebook/tests/launchnotebook.py:59: RuntimeError
212s __________ ERROR at setup of TestGateway.test_gateway_get_kernelspecs __________
212s
212s self =
212s
212s def _new_conn(self) -> socket.socket:
212s """Establish a socket connection and set nodelay settings on it.
212s
212s :return: New socket connection.
212s """
212s try:
212s > sock = connection.create_connection(
212s (self._dns_host, self.port),
212s self.timeout,
212s source_address=self.source_address,
212s socket_options=self.socket_options,
212s )
212s
212s /usr/lib/python3/dist-packages/urllib3/connection.py:203:
212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection
212s raise err
212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
212s
212s address = ('localhost', 12341), timeout = None, source_address = None
212s socket_options = [(6, 1, 1)]
212s
212s def create_connection(
212s address: tuple[str, int],
212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
212s source_address: tuple[str, int] | None = None,
212s socket_options: _TYPE_SOCKET_OPTIONS | None = None,
212s ) -> socket.socket:
212s """Connect to *address* and return the socket object.
212s
212s Convenience function. Connect to *address* (a 2-tuple ``(host,
212s port)``) and return the socket object. Passing the optional
212s *timeout* parameter will set the timeout on the socket instance
212s before attempting to connect. If no *timeout* is supplied, the
212s global default timeout setting returned by :func:`socket.getdefaulttimeout`
212s is used. If *source_address* is set it must be a tuple of (host, port)
212s for the socket to bind as a source address before making the connection.
212s An host of '' or port 0 tells the OS to use the default.
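The setup errors above all reduce to the same thing: launchnotebook.py's wait_until_alive() keeps polling http://localhost:12341/a%40b/api/contents and only ever gets ECONNREFUSED because the notebook server thread has already died. A minimal standalone sketch of that liveness poll, assuming placeholder timing values (the suite's real MAX_WAITTIME and POLL_INTERVAL constants are not shown in this log):

import time
import requests

BASE_URL = "http://localhost:12341/a%40b/"  # host, port and base path taken from the traceback above
MAX_WAITTIME = 30     # placeholder, seconds
POLL_INTERVAL = 0.5   # placeholder, seconds

def wait_until_alive(base_url: str = BASE_URL) -> bool:
    """Poll the contents API until the server answers or the time budget runs out."""
    url = base_url + "api/contents"
    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)   # any HTTP response at all means something is listening
            return True
        except requests.exceptions.ConnectionError:
            time.sleep(POLL_INTERVAL)   # ECONNREFUSED: nothing listening yet, retry
    return False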
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s GatewayClient.clear_instance() 212s > super().setup_class() 212s 212s notebook/tests/test_gateway.py:138: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _______ ERROR at setup of TestGateway.test_gateway_get_named_kernelspec ________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
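For reference, a sketch (not taken from the test suite) of how the exception chain shown in these tracebacks reaches a requests caller: the refused socket connect becomes urllib3's NewConnectionError, the exhausted Retry(total=0) turns that into MaxRetryError, and requests' HTTPAdapter re-raises it as requests.exceptions.ConnectionError. It assumes nothing is listening on port 12341:

import requests
from urllib3.exceptions import MaxRetryError, NewConnectionError

try:
    requests.get("http://localhost:12341/a%40b/api/contents")
except requests.exceptions.ConnectionError as exc:
    cause = exc.args[0] if exc.args else None             # the wrapped MaxRetryError
    print(type(cause).__name__)                           # MaxRetryError
    if isinstance(cause, MaxRetryError):
        print(type(cause.reason).__name__)                # NewConnectionError
        print(isinstance(cause.reason, NewConnectionError))  # True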
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s GatewayClient.clear_instance() 212s > super().setup_class() 212s 212s notebook/tests/test_gateway.py:138: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of TestGateway.test_gateway_kernel_lifecycle __________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s GatewayClient.clear_instance() 212s > super().setup_class() 212s 212s notebook/tests/test_gateway.py:138: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of TestGateway.test_gateway_options ______________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s GatewayClient.clear_instance() 212s > super().setup_class() 212s 212s notebook/tests/test_gateway.py:138: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of TestGateway.test_gateway_session_lifecycle _________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
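The ConnectionRefusedError recorded in this frame comes from urllib3's create_connection helper, whose docstring is quoted above. As a hedged illustration only (host and port are taken from the log; the helper name and everything else below are assumptions, not part of the test suite), the same condition can be checked directly with the standard library:

    import socket

    def port_is_open(host="localhost", port=12341, timeout=1.0):
        # True only if something is accepting TCP connections on the port;
        # errno 111 (connection refused) simply yields False here.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(port_is_open())  # False while the notebook server is not listening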
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
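The retries documentation quoted above governs the Retry(total=0, connect=None, read=False, ...) object visible throughout these tracebacks; requests' HTTPAdapter passes its max_retries value straight through as this parameter. A minimal sketch, reusing the host, port and contents path from the log; the retry counts and timeouts are illustrative, not taken from the test suite:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    # Retry connection errors a few times with backoff instead of failing on
    # the first refused connection, as the default adapter configuration does.
    retry = Retry(total=3, connect=3, backoff_factor=0.5)
    session.mount("http://", HTTPAdapter(max_retries=retry))
    try:
        session.get("http://localhost:12341/a%40b/api/contents", timeout=(3.05, 10))
    except requests.exceptions.ConnectionError as exc:
        print("still refused after retries:", exc)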
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s GatewayClient.clear_instance() 212s > super().setup_class() 212s 212s notebook/tests/test_gateway.py:138: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _________ ERROR at setup of NotebookAppTests.test_list_running_servers _________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
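wait_until_alive, shown at the start of this error, polls the contents API until the server answers and raises the RuntimeError once the notebook thread is no longer alive. A rough sketch of that polling pattern, assuming MAX_WAITTIME and POLL_INTERVAL values rather than reading them from launchnotebook.py, and simplified to a plain time budget instead of the thread-liveness check used by the real helper:

    import time
    import requests

    MAX_WAITTIME, POLL_INTERVAL = 30, 0.1   # assumed values

    def wait_until_alive(url):
        # Keep probing until the server responds or the time budget runs out.
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return
            except requests.exceptions.ConnectionError:
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("The notebook server failed to start")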
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
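The urlopen documentation above accepts either a plain number of seconds or an urllib3 Timeout, and HTTPAdapter.send converts a (connect, read) tuple into one before calling urlopen. A small illustration with assumed values (the requests in this log carried Timeout(connect=None, read=None, total=None), i.e. no timeout at all):

    from urllib3.util.timeout import Timeout

    connect, read = 3.05, 27          # example values, not from the log
    timeout = Timeout(connect=connect, read=read)
    print(timeout.connect_timeout, timeout.read_timeout)   # 3.05 27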
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 
'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ___________ ERROR at setup of NotebookAppTests.test_log_json_default ___________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
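The chain reproduced above is the standard requests/urllib3 failure path: the pool is driven with Retry(total=0), so the very first refused connection exhausts the retry budget, urllib3 raises MaxRetryError, and requests re-wraps it as ConnectionError for the test's fetch_url call. A minimal sketch of that behaviour, assuming nothing is listening on the port shown in the traceback:

import requests
from requests.adapters import HTTPAdapter

# max_retries=0 mirrors the Retry(total=0, ...) object seen in the traceback:
# a single refused connection is surfaced immediately instead of being retried.
session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=0))
try:
    session.get("http://localhost:12341/a%40b/api/contents", timeout=1)
except requests.exceptions.ConnectionError as exc:
    print("connection refused, no retries attempted:", exc)
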
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s __________ ERROR at setup of NotebookAppTests.test_validate_log_json ___________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
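The comments in setup_class describe the isolation strategy: environment variables and jupyter_core.paths are patched so the test server only sees throwaway directories. A rough sketch of the same idea, with hypothetical paths (the real values come from get_patch_env() and the tmp() helper shown above):

import os
from unittest.mock import patch

# Hypothetical throwaway locations; the harness builds these under a
# TemporaryDirectory rather than hard-coding them.
isolated_env = {
    "HOME": "/tmp/nbtest-home",
    "JUPYTER_CONFIG_DIR": "/tmp/nbtest-home/config",
    "JUPYTER_DATA_DIR": "/tmp/nbtest-home/data",
    "JUPYTER_RUNTIME_DIR": "/tmp/nbtest-home/runtime",
}

with patch.dict(os.environ, isolated_env):
    # Anything started inside this block resolves its config/data/runtime
    # directories to the isolated paths instead of the real user ones.
    print(os.environ["JUPYTER_CONFIG_DIR"])
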
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ___ ERROR at setup of NotebookUnixSocketTests.test_list_running_sock_servers ___ 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 
212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 
212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 
212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def connect(self): 212s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 212s sock.settimeout(self.timeout) 212s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 212s > sock.connect(socket_path) 212s E FileNotFoundError: [Errno 2] No such file or directory 212s 212s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: FileNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
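[Aside, not part of the captured test output] The FileNotFoundError above comes from requests_unixsocket: the adapter pulls the socket path out of the URL's netloc, unquotes it, and calls sock.connect() on it, so the error simply means the NotebookApp thread never created its Unix socket file. As a rough illustration only (the socket path below is made up, not taken from the test suite), a client request over a Unix socket with requests_unixsocket looks roughly like this:

    import urllib.parse

    import requests_unixsocket

    # Hypothetical socket path; a server must already be listening here,
    # otherwise connect() fails with FileNotFoundError exactly as in the log.
    socket_path = "/tmp/notebook-test.sock"

    # The socket path is percent-encoded and used as the "host" part of an
    # http+unix:// URL; the adapter unquotes it again before connecting.
    encoded = urllib.parse.quote(socket_path, safe="")

    session = requests_unixsocket.Session()
    response = session.get(f"http+unix://{encoded}/api/contents")
    print(response.status_code)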
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:470: in increment 212s raise reraise(type(error), error, _stacktrace) 212s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 212s raise value.with_traceback(tb) 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: in urlopen 212s response = self._make_request( 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def connect(self): 212s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 212s sock.settimeout(self.timeout) 212s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 212s > sock.connect(socket_path) 212s E urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 212s 212s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: ProtocolError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:242: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:51: in 
get 212s return request('get', url, **kwargs) 212s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:46: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s > raise ConnectionError(err, request=request) 212s E requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:501: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
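[Aside, not part of the captured test output] The setup_class/wait_until_alive fixture quoted in these traces follows a common pattern: start the server in a daemon thread, then poll an HTTP endpoint until it answers, bailing out early if the thread has already died. A stripped-down sketch of that pattern using only the standard library and requests (the names DummyHandler and wait_until_alive here are illustrative, not the notebook test suite's own helpers):

    import threading
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    import requests

    class DummyHandler(BaseHTTPRequestHandler):
        def do_GET(self):                     # stand-in for /api/contents
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"{}")

    server = HTTPServer(("127.0.0.1", 0), DummyHandler)  # port 0 = any free port
    port = server.server_address[1]

    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()

    MAX_WAITTIME, POLL_INTERVAL = 30, 0.1     # same idea as the harness constants

    def wait_until_alive(url):
        for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
            try:
                requests.get(url)
                return                        # server answered: it is alive
            except Exception:
                if not thread.is_alive():     # server thread died: fail fast
                    raise RuntimeError("server failed to start")
                time.sleep(POLL_INTERVAL)
        raise RuntimeError("server never became reachable")

    wait_until_alive(f"http://127.0.0.1:{port}/api/contents")
    server.shutdown()

In the failing runs above the polling loop takes the fail-fast branch, which is why the setup errors are reported as "The notebook server failed to start" rather than as a timeout.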
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ______________ ERROR at setup of NotebookUnixSocketTests.test_run ______________ 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 
212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 
212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 
212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def connect(self): 212s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 212s sock.settimeout(self.timeout) 212s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 212s > sock.connect(socket_path) 212s E FileNotFoundError: [Errno 2] No such file or directory 212s 212s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: FileNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:470: in increment 212s raise reraise(type(error), error, _stacktrace) 212s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 212s raise value.with_traceback(tb) 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: in urlopen 212s response = self._make_request( 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def connect(self): 212s sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) 212s sock.settimeout(self.timeout) 212s socket_path = unquote(urlparse(self.unix_socket_url).netloc) 212s > sock.connect(socket_path) 212s E urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 212s 212s /usr/lib/python3/dist-packages/requests_unixsocket/adapters.py:36: ProtocolError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:242: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:51: in 
get 212s return request('get', url, **kwargs) 212s /usr/lib/python3/dist-packages/requests_unixsocket/__init__.py:46: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s > raise ConnectionError(err, request=request) 212s E requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:501: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _____ ERROR at setup of NotebookAppJSONLoggingTests.test_log_json_enabled ______ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 
212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
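[Aside, not part of the captured test output] Retry(total=0, ...) in the traces above means urllib3 gives up on the first connection error, wraps it in MaxRetryError, and requests re-raises that as ConnectionError, which is exactly the chain shown here. For comparison, a hedged sketch of how a caller could opt into a few retries with back-off by mounting an HTTPAdapter; this is ordinary requests/urllib3 usage, not something the notebook test suite itself does:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(
        total=3,            # up to three retries overall
        connect=3,          # ...including connection errors like ECONNREFUSED
        backoff_factor=0.5, # exponential back-off between attempts
    )
    session.mount("http://", HTTPAdapter(max_retries=retry))

    # Still ends in requests.exceptions.ConnectionError once the retries are
    # exhausted, but only after the attempts configured above.
    # session.get("http://localhost:12341/a%40b/api/contents")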
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s > super().setup_class() 212s 212s notebook/tests/test_notebookapp.py:212: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s _____ ERROR at setup of NotebookAppJSONLoggingTests.test_validate_log_json _____ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s > super().setup_class() 212s 212s notebook/tests/test_notebookapp.py:212: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:198: in setup_class 212s cls.wait_until_alive() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ____________ ERROR at setup of RedirectTestCase.test_trailing_slash ____________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 
212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 
212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 
212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 
212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 
212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 
212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 
212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 
'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s ___________________ ERROR at setup of TreeTest.test_redirect ___________________ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s > sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:203: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection 212s raise err 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s address = ('localhost', 12341), timeout = None, source_address = None 212s socket_options = [(6, 1, 1)] 212s 212s def create_connection( 212s address: tuple[str, int], 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s source_address: tuple[str, int] | None = None, 212s socket_options: _TYPE_SOCKET_OPTIONS | None = None, 212s ) -> socket.socket: 212s """Connect to *address* and return the socket object. 212s 212s Convenience function. Connect to *address* (a 2-tuple ``(host, 212s port)``) and return the socket object. Passing the optional 212s *timeout* parameter will set the timeout on the socket instance 212s before attempting to connect. If no *timeout* is supplied, the 212s global default timeout setting returned by :func:`socket.getdefaulttimeout` 212s is used. If *source_address* is set it must be a tuple of (host, port) 212s for the socket to bind as a source address before making the connection. 212s An host of '' or port 0 tells the OS to use the default. 212s """ 212s 212s host, port = address 212s if host.startswith("["): 212s host = host.strip("[]") 212s err = None 212s 212s # Using the value from allowed_gai_family() in the context of getaddrinfo lets 212s # us select whether to work with IPv4 DNS records, IPv6 records, or both. 212s # The original create_connection function always returns all records. 212s family = allowed_gai_family() 212s 212s try: 212s host.encode("idna") 212s except UnicodeError: 212s raise LocationParseError(f"'{host}', label empty or too long") from None 212s 212s for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 212s af, socktype, proto, canonname, sa = res 212s sock = None 212s try: 212s sock = socket.socket(af, socktype, proto) 212s 212s # If provided, set socket level options before connecting. 
212s _set_socket_options(sock, socket_options) 212s 212s if timeout is not _DEFAULT_TIMEOUT: 212s sock.settimeout(timeout) 212s if source_address: 212s sock.bind(source_address) 212s > sock.connect(sa) 212s E ConnectionRefusedError: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s method = 'GET', url = '/a%40b/api/contents', body = None 212s headers = {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} 212s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s redirect = False, assert_same_host = False 212s timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None 212s release_conn = False, chunked = False, body_pos = None, preload_content = False 212s decode_content = False, response_kw = {} 212s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/a%40b/api/contents', query=None, fragment=None) 212s destination_scheme = None, conn = None, release_this_conn = True 212s http_tunnel_required = False, err = None, clean_exit = False 212s 212s def urlopen( # type: ignore[override] 212s self, 212s method: str, 212s url: str, 212s body: _TYPE_BODY | None = None, 212s headers: typing.Mapping[str, str] | None = None, 212s retries: Retry | bool | int | None = None, 212s redirect: bool = True, 212s assert_same_host: bool = True, 212s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 212s pool_timeout: int | None = None, 212s release_conn: bool | None = None, 212s chunked: bool = False, 212s body_pos: _TYPE_BODY_POSITION | None = None, 212s preload_content: bool = True, 212s decode_content: bool = True, 212s **response_kw: typing.Any, 212s ) -> BaseHTTPResponse: 212s """ 212s Get a connection from the pool and perform an HTTP request. This is the 212s lowest level call for making a request, so you'll need to specify all 212s the raw details. 212s 212s .. note:: 212s 212s More commonly, it's appropriate to use a convenience method 212s such as :meth:`request`. 212s 212s .. note:: 212s 212s `release_conn` will only behave as expected if 212s `preload_content=False` because we want to make 212s `preload_content=False` the default behaviour someday soon without 212s breaking backwards compatibility. 212s 212s :param method: 212s HTTP request method (such as GET, POST, PUT, etc.) 212s 212s :param url: 212s The URL to perform the request on. 212s 212s :param body: 212s Data to send in the request body, either :class:`str`, :class:`bytes`, 212s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 212s 212s :param headers: 212s Dictionary of custom headers to send, such as User-Agent, 212s If-None-Match, etc. If None, pool headers are used. If provided, 212s these headers completely replace any pool-specific headers. 212s 212s :param retries: 212s Configure the number of retries to allow before raising a 212s :class:`~urllib3.exceptions.MaxRetryError` exception. 212s 212s Pass ``None`` to retry until you receive a response. Pass a 212s :class:`~urllib3.util.retry.Retry` object for fine-grained control 212s over different types of retries. 212s Pass an integer number to retry connection errors that many times, 212s but no other types of errors. Pass zero to never retry. 212s 212s If ``False``, then retries are disabled and any exception is raised 212s immediately. 
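The layers above (requests -> urllib3 -> http.client) all bottom out in the plain socket connect quoted from create_connection, so the failure can be reproduced without any of them. A minimal sketch, assuming nothing is listening on the port the harness picked (12341 in this run):

    import socket

    try:
        sock = socket.create_connection(("localhost", 12341), timeout=1)
    except OSError as exc:
        print(exc)      # [Errno 111] Connection refused, as in the traceback above
    else:
        sock.close()    # something was listening after all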
Also, instead of raising a MaxRetryError on redirects, 212s the redirect response will be returned. 212s 212s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 212s 212s :param redirect: 212s If True, automatically handle redirects (status codes 301, 302, 212s 303, 307, 308). Each redirect counts as a retry. Disabling retries 212s will disable redirect, too. 212s 212s :param assert_same_host: 212s If ``True``, will make sure that the host of the pool requests is 212s consistent else will raise HostChangedError. When ``False``, you can 212s use the pool on an HTTP proxy and request foreign hosts. 212s 212s :param timeout: 212s If specified, overrides the default timeout for this one 212s request. It may be a float (in seconds) or an instance of 212s :class:`urllib3.util.Timeout`. 212s 212s :param pool_timeout: 212s If set and the pool is set to block=True, then this method will 212s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 212s connection is available within the time period. 212s 212s :param bool preload_content: 212s If True, the response's body will be preloaded into memory. 212s 212s :param bool decode_content: 212s If True, will attempt to decode the body based on the 212s 'content-encoding' header. 212s 212s :param release_conn: 212s If False, then the urlopen call will not release the connection 212s back into the pool once a response is received (but will release if 212s you read the entire contents of the response such as when 212s `preload_content=True`). This is useful if you're not preloading 212s the response's content immediately. You will need to call 212s ``r.release_conn()`` on the response ``r`` to return the connection 212s back into the pool. If None, it takes the value of ``preload_content`` 212s which defaults to ``True``. 212s 212s :param bool chunked: 212s If True, urllib3 will send the body using chunked transfer 212s encoding. Otherwise, urllib3 will send the body using the standard 212s content-length form. Defaults to False. 212s 212s :param int body_pos: 212s Position to seek to in file-like body in the event of a retry or 212s redirect. Typically this won't need to be set because urllib3 will 212s auto-populate the value when needed. 212s """ 212s parsed_url = parse_url(url) 212s destination_scheme = parsed_url.scheme 212s 212s if headers is None: 212s headers = self.headers 212s 212s if not isinstance(retries, Retry): 212s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 212s 212s if release_conn is None: 212s release_conn = preload_content 212s 212s # Check host 212s if assert_same_host and not self.is_same_host(url): 212s raise HostChangedError(self, url, retries) 212s 212s # Ensure that the URL we're connecting to is properly encoded 212s if url.startswith("/"): 212s url = to_str(_encode_target(url)) 212s else: 212s url = to_str(parsed_url.url) 212s 212s conn = None 212s 212s # Track whether `conn` needs to be released before 212s # returning/raising/recursing. Update this variable if necessary, and 212s # leave `release_conn` constant throughout the function. That way, if 212s # the function recurses, the original value of `release_conn` will be 212s # passed down into the recursive call, and its value will be respected. 212s # 212s # See issue #651 [1] for details. 212s # 212s # [1] 212s release_this_conn = release_conn 212s 212s http_tunnel_required = connection_requires_http_tunnel( 212s self.proxy, self.proxy_config, destination_scheme 212s ) 212s 212s # Merge the proxy headers. 
Only done when not using HTTP CONNECT. We 212s # have to copy the headers dict so we can safely change it without those 212s # changes being reflected in anyone else's copy. 212s if not http_tunnel_required: 212s headers = headers.copy() # type: ignore[attr-defined] 212s headers.update(self.proxy_headers) # type: ignore[union-attr] 212s 212s # Must keep the exception bound to a separate variable or else Python 3 212s # complains about UnboundLocalError. 212s err = None 212s 212s # Keep track of whether we cleanly exited the except block. This 212s # ensures we do proper cleanup in finally. 212s clean_exit = False 212s 212s # Rewind body position, if needed. Record current position 212s # for future rewinds in the event of a redirect/retry. 212s body_pos = set_file_position(body, body_pos) 212s 212s try: 212s # Request a connection from the queue. 212s timeout_obj = self._get_timeout(timeout) 212s conn = self._get_conn(timeout=pool_timeout) 212s 212s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 212s 212s # Is this a closed/new connection that requires CONNECT tunnelling? 212s if self.proxy is not None and http_tunnel_required and conn.is_closed: 212s try: 212s self._prepare_proxy(conn) 212s except (BaseSSLError, OSError, SocketTimeout) as e: 212s self._raise_timeout( 212s err=e, url=self.proxy.url, timeout_value=conn.timeout 212s ) 212s raise 212s 212s # If we're going to release the connection in ``finally:``, then 212s # the response doesn't need to know about the connection. Otherwise 212s # it will also try to release it and we'll have a double-release 212s # mess. 212s response_conn = conn if not release_conn else None 212s 212s # Make the request on the HTTPConnection object 212s > response = self._make_request( 212s conn, 212s method, 212s url, 212s timeout=timeout_obj, 212s body=body, 212s headers=headers, 212s chunked=chunked, 212s retries=retries, 212s response_conn=response_conn, 212s preload_content=preload_content, 212s decode_content=decode_content, 212s **response_kw, 212s ) 212s 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:497: in _make_request 212s conn.request( 212s /usr/lib/python3/dist-packages/urllib3/connection.py:395: in request 212s self.endheaders() 212s /usr/lib/python3.12/http/client.py:1331: in endheaders 212s self._send_output(message_body, encode_chunked=encode_chunked) 212s /usr/lib/python3.12/http/client.py:1091: in _send_output 212s self.send(msg) 212s /usr/lib/python3.12/http/client.py:1035: in send 212s self.connect() 212s /usr/lib/python3/dist-packages/urllib3/connection.py:243: in connect 212s self.sock = self._new_conn() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _new_conn(self) -> socket.socket: 212s """Establish a socket connection and set nodelay settings on it. 212s 212s :return: New socket connection. 212s """ 212s try: 212s sock = connection.create_connection( 212s (self._dns_host, self.port), 212s self.timeout, 212s source_address=self.source_address, 212s socket_options=self.socket_options, 212s ) 212s except socket.gaierror as e: 212s raise NameResolutionError(self.host, self, e) from e 212s except SocketTimeout as e: 212s raise ConnectTimeoutError( 212s self, 212s f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 212s ) from e 212s 212s except OSError as e: 212s > raise NewConnectionError( 212s self, f"Failed to establish a new connection: {e}" 212s ) from e 212s E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 212s 212s /usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 
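The timeout handling quoted above is why the requests side accepts either a bare float or a (connect, read) tuple; a tuple of the wrong shape is what trips the ValueError. A short usage sketch against the same closed port, showing the tuple form and the rewrapped ConnectionError:

    import requests

    try:
        # A (connect, read) tuple becomes a urllib3 Timeout; a single float sets both values.
        requests.get("http://localhost:12341/a%40b/api/contents", timeout=(3.05, 27))
    except requests.exceptions.ConnectionError as exc:
        print(exc)      # same "Max retries exceeded ... Connection refused" as above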
212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s > resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:486: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen 212s retries = retries.increment( 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 212s method = 'GET', url = '/a%40b/api/contents', response = None 212s error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 212s _pool = 212s _stacktrace = 212s 212s def increment( 212s self, 212s method: str | None = None, 212s url: str | None = None, 212s response: BaseHTTPResponse | None = None, 212s error: Exception | None = None, 212s _pool: ConnectionPool | None = None, 212s _stacktrace: TracebackType | None = None, 212s ) -> Retry: 212s """Return a new Retry object with incremented retry counters. 212s 212s :param response: A response object, or None, if the server did not 212s return a response. 212s :type response: :class:`~urllib3.response.BaseHTTPResponse` 212s :param Exception error: An error encountered during the request, or 212s None if the response was received successfully. 212s 212s :return: A new ``Retry`` object. 212s """ 212s if self.total is False and error: 212s # Disabled, indicate to re-raise the error. 212s raise reraise(type(error), error, _stacktrace) 212s 212s total = self.total 212s if total is not None: 212s total -= 1 212s 212s connect = self.connect 212s read = self.read 212s redirect = self.redirect 212s status_count = self.status 212s other = self.other 212s cause = "unknown" 212s status = None 212s redirect_location = None 212s 212s if error and self._is_connection_error(error): 212s # Connect retry? 212s if connect is False: 212s raise reraise(type(error), error, _stacktrace) 212s elif connect is not None: 212s connect -= 1 212s 212s elif error and self._is_read_error(error): 212s # Read retry? 212s if read is False or method is None or not self._is_method_retryable(method): 212s raise reraise(type(error), error, _stacktrace) 212s elif read is not None: 212s read -= 1 212s 212s elif error: 212s # Other retry? 212s if other is not None: 212s other -= 1 212s 212s elif response and response.get_redirect_location(): 212s # Redirect retry? 
212s if redirect is not None: 212s redirect -= 1 212s cause = "too many redirects" 212s response_redirect_location = response.get_redirect_location() 212s if response_redirect_location: 212s redirect_location = response_redirect_location 212s status = response.status 212s 212s else: 212s # Incrementing because of a server error like a 500 in 212s # status_forcelist and the given method is in the allowed_methods 212s cause = ResponseError.GENERIC_ERROR 212s if response and response.status: 212s if status_count is not None: 212s status_count -= 1 212s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 212s status = response.status 212s 212s history = self.history + ( 212s RequestHistory(method, url, error, status, redirect_location), 212s ) 212s 212s new_retry = self.new( 212s total=total, 212s connect=connect, 212s read=read, 212s redirect=redirect, 212s status=status_count, 212s other=other, 212s history=history, 212s ) 212s 212s if new_retry.is_exhausted(): 212s reason = error or ResponseError(cause) 212s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 212s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError 212s 212s During handling of the above exception, another exception occurred: 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s > cls.fetch_url(url) 212s 212s notebook/tests/launchnotebook.py:53: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s notebook/tests/launchnotebook.py:82: in fetch_url 212s return requests.get(url) 212s /usr/lib/python3/dist-packages/requests/api.py:73: in get 212s return request("get", url, params=params, **kwargs) 212s /usr/lib/python3/dist-packages/requests/api.py:59: in request 212s return session.request(method=method, url=url, **kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 212s resp = self.send(prep, **send_kwargs) 212s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 212s r = adapter.send(request, **kwargs) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s request = , stream = False 212s timeout = Timeout(connect=None, read=None, total=None), verify = True 212s cert = None, proxies = OrderedDict() 212s 212s def send( 212s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 212s ): 212s """Sends PreparedRequest object. Returns Response object. 212s 212s :param request: The :class:`PreparedRequest ` being sent. 212s :param stream: (optional) Whether to stream the request content. 212s :param timeout: (optional) How long to wait for the server to send 212s data before giving up, as a float, or a :ref:`(connect timeout, 212s read timeout) ` tuple. 212s :type timeout: float or tuple or urllib3 Timeout object 212s :param verify: (optional) Either a boolean, in which case it controls whether 212s we verify the server's TLS certificate, or a string, in which case it 212s must be a path to a CA bundle to use 212s :param cert: (optional) Any user-provided SSL certificate to be trusted. 
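The adapter passes its max_retries object straight through as retries, and the log shows it as Retry(total=0, connect=None, read=False, ...), so the first connection error already exhausts the budget and increment() raises the MaxRetryError that requests later rewraps. A minimal sketch of that path, assuming urllib3 2.x behaviour:

    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    retry = Retry(0, read=False)    # mirrors Retry(total=0, ..., read=False) from the log
    try:
        # With no retries left, any error makes the incremented Retry exhausted.
        retry.increment(method="GET", url="/a%40b/api/contents",
                        error=OSError("[Errno 111] Connection refused"))
    except MaxRetryError as exc:
        print(exc)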
212s :param proxies: (optional) The proxies dictionary to apply to the request. 212s :rtype: requests.Response 212s """ 212s 212s try: 212s conn = self.get_connection(request.url, proxies) 212s except LocationValueError as e: 212s raise InvalidURL(e, request=request) 212s 212s self.cert_verify(conn, request.url, verify, cert) 212s url = self.request_url(request, proxies) 212s self.add_headers( 212s request, 212s stream=stream, 212s timeout=timeout, 212s verify=verify, 212s cert=cert, 212s proxies=proxies, 212s ) 212s 212s chunked = not (request.body is None or "Content-Length" in request.headers) 212s 212s if isinstance(timeout, tuple): 212s try: 212s connect, read = timeout 212s timeout = TimeoutSauce(connect=connect, read=read) 212s except ValueError: 212s raise ValueError( 212s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 212s f"or a single float to set both timeouts to the same value." 212s ) 212s elif isinstance(timeout, TimeoutSauce): 212s pass 212s else: 212s timeout = TimeoutSauce(connect=timeout, read=timeout) 212s 212s try: 212s resp = conn.urlopen( 212s method=request.method, 212s url=url, 212s body=request.body, 212s headers=request.headers, 212s redirect=False, 212s assert_same_host=False, 212s preload_content=False, 212s decode_content=False, 212s retries=self.max_retries, 212s timeout=timeout, 212s chunked=chunked, 212s ) 212s 212s except (ProtocolError, OSError) as err: 212s raise ConnectionError(err, request=request) 212s 212s except MaxRetryError as e: 212s if isinstance(e.reason, ConnectTimeoutError): 212s # TODO: Remove this in 3.0.0: see #2811 212s if not isinstance(e.reason, NewConnectionError): 212s raise ConnectTimeout(e, request=request) 212s 212s if isinstance(e.reason, ResponseError): 212s raise RetryError(e, request=request) 212s 212s if isinstance(e.reason, _ProxyError): 212s raise ProxyError(e, request=request) 212s 212s if isinstance(e.reason, _SSLError): 212s # This branch is for urllib3 v1.22 and later. 212s raise SSLError(e, request=request) 212s 212s > raise ConnectionError(e, request=request) 212s E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=12341): Max retries exceeded with url: /a%40b/api/contents (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 212s 212s /usr/lib/python3/dist-packages/requests/adapters.py:519: ConnectionError 212s 212s The above exception was the direct cause of the following exception: 212s 212s cls = 212s 212s @classmethod 212s def setup_class(cls): 212s cls.tmp_dir = TemporaryDirectory() 212s def tmp(*parts): 212s path = os.path.join(cls.tmp_dir.name, *parts) 212s try: 212s os.makedirs(path) 212s except OSError as e: 212s if e.errno != errno.EEXIST: 212s raise 212s return path 212s 212s cls.home_dir = tmp('home') 212s data_dir = cls.data_dir = tmp('data') 212s config_dir = cls.config_dir = tmp('config') 212s runtime_dir = cls.runtime_dir = tmp('runtime') 212s cls.notebook_dir = tmp('notebooks') 212s cls.env_patch = patch.dict('os.environ', cls.get_patch_env()) 212s cls.env_patch.start() 212s # Patch systemwide & user-wide data & config directories, to isolate 212s # the tests from oddities of the local setup. But leave Python env 212s # locations alone, so data files for e.g. nbconvert are accessible. 212s # If this isolation isn't sufficient, you may need to run the tests in 212s # a virtualenv or conda env. 
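setup_class above isolates the server under one throwaway directory tree and patches the environment before anything is started. A condensed sketch of that recipe; get_patch_env() is not shown in this log, so the variable names below are an assumption based on jupyter_core's standard environment variables:

    import os
    from tempfile import TemporaryDirectory
    from unittest.mock import patch

    tmp = TemporaryDirectory()

    def tmpdir(*parts):
        path = os.path.join(tmp.name, *parts)
        os.makedirs(path, exist_ok=True)
        return path

    env = patch.dict(os.environ, {
        "HOME": tmpdir("home"),                   # assumed variable set, see note above
        "JUPYTER_CONFIG_DIR": tmpdir("config"),
        "JUPYTER_DATA_DIR": tmpdir("data"),
        "JUPYTER_RUNTIME_DIR": tmpdir("runtime"),
    })
    env.start()
    try:
        pass    # start the app and run requests against it here
    finally:
        env.stop()
        tmp.cleanup()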
212s cls.path_patch = patch.multiple( 212s jupyter_core.paths, 212s SYSTEM_JUPYTER_PATH=[tmp('share', 'jupyter')], 212s SYSTEM_CONFIG_PATH=[tmp('etc', 'jupyter')], 212s ) 212s cls.path_patch.start() 212s 212s config = cls.config or Config() 212s config.NotebookNotary.db_file = ':memory:' 212s 212s cls.token = hexlify(os.urandom(4)).decode('ascii') 212s 212s started = Event() 212s def start_thread(): 212s try: 212s bind_args = cls.get_bind_args() 212s app = cls.notebook = NotebookApp( 212s port_retries=0, 212s open_browser=False, 212s config_dir=cls.config_dir, 212s data_dir=cls.data_dir, 212s runtime_dir=cls.runtime_dir, 212s notebook_dir=cls.notebook_dir, 212s base_url=cls.url_prefix, 212s config=config, 212s allow_root=True, 212s token=cls.token, 212s **bind_args 212s ) 212s if "asyncio" in sys.modules: 212s app._init_asyncio_patch() 212s import asyncio 212s 212s asyncio.set_event_loop(asyncio.new_event_loop()) 212s # Patch the current loop in order to match production 212s # behavior 212s import nest_asyncio 212s 212s nest_asyncio.apply() 212s # don't register signal handler during tests 212s app.init_signal = lambda : None 212s # clear log handlers and propagate to root for nose to capture it 212s # needs to be redone after initialize, which reconfigures logging 212s app.log.propagate = True 212s app.log.handlers = [] 212s app.initialize(argv=cls.get_argv()) 212s app.log.propagate = True 212s app.log.handlers = [] 212s loop = IOLoop.current() 212s loop.add_callback(started.set) 212s app.start() 212s finally: 212s # set the event, so failure to start doesn't cause a hang 212s started.set() 212s app.session_manager.close() 212s cls.notebook_thread = Thread(target=start_thread) 212s cls.notebook_thread.daemon = True 212s cls.notebook_thread.start() 212s started.wait() 212s > cls.wait_until_alive() 212s 212s notebook/tests/launchnotebook.py:198: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s cls = 212s 212s @classmethod 212s def wait_until_alive(cls): 212s """Wait for the server to be alive""" 212s url = cls.base_url() + 'api/contents' 212s for _ in range(int(MAX_WAITTIME/POLL_INTERVAL)): 212s try: 212s cls.fetch_url(url) 212s except ModuleNotFoundError as error: 212s # Errors that should be immediately thrown back to caller 212s raise error 212s except Exception as e: 212s if not cls.notebook_thread.is_alive(): 212s > raise RuntimeError("The notebook server failed to start") from e 212s E RuntimeError: The notebook server failed to start 212s 212s notebook/tests/launchnotebook.py:59: RuntimeError 212s =================================== FAILURES =================================== 212s __________________ TestSessionManager.test_bad_delete_session __________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 
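The wait_until_alive loop shown above is a plain poll: hit the contents API until it answers or the budget runs out, and only then decide whether the server thread died. A standalone sketch of the pattern; MAX_WAITTIME and POLL_INTERVAL are the harness's own constants and the values here are placeholders:

    import time
    import requests

    MAX_WAITTIME, POLL_INTERVAL = 5, 0.1    # placeholder values
    url = "http://localhost:12341/a%40b/api/contents"

    for _ in range(int(MAX_WAITTIME / POLL_INTERVAL)):
        try:
            requests.get(url)
            break                           # the server answered; any HTTP status will do
        except requests.ConnectionError:
            time.sleep(POLL_INTERVAL)
    else:
        raise RuntimeError("The notebook server failed to start")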
212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s ___________________ TestSessionManager.test_bad_get_session ____________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." 
% type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s __________________ TestSessionManager.test_bad_update_session __________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s ____________________ TestSessionManager.test_delete_session ____________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
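Every TestSessionManager failure in this block has the same root: traitlets' import_item is asked to resolve the string default 'jupyter_server.services.contents.manager.ContentsManager' and the jupyter_server package is not installed on this testbed. The helper itself behaves like a from-import; a small sketch (the second call only fails where jupyter_server is absent, as it is here):

    from traitlets.utils.importstring import import_item

    # Equivalent to: from traitlets.config import Config
    Config = import_item("traitlets.config.Config")
    print(Config)

    try:
        import_item("jupyter_server.services.contents.manager.ContentsManager")
    except ModuleNotFoundError as exc:
        print(exc)      # No module named 'jupyter_server'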
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s _____________________ TestSessionManager.test_get_session ______________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s _______________ TestSessionManager.test_get_session_dead_kernel ________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s ____________________ TestSessionManager.test_list_sessions _____________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s ______________ TestSessionManager.test_list_sessions_dead_kernel _______________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s ____________________ TestSessionManager.test_update_session ____________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:336: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.services.contents.manager.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def setUp(self): 212s > self.sm = SessionManager( 212s kernel_manager=DummyMKM(), 212s contents_manager=ContentsManager(), 212s ) 212s 212s notebook/services/sessions/tests/test_sessionmanager.py:45: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:327: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:339: TypeError 212s _______________________________ test_help_output _______________________________ 212s 212s def test_help_output(): 212s """ipython notebook --help-all works""" 212s > check_help_all_output('notebook') 212s 212s notebook/tests/test_notebookapp.py:28: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s pkg = 'notebook', subcommand = None 212s 212s def check_help_all_output(pkg: str, subcommand: Sequence[str] | None = None) -> tuple[str, str]: 212s """test that `python -m PKG --help-all` works""" 212s cmd = [sys.executable, "-m", pkg] 212s if subcommand: 212s cmd.extend(subcommand) 212s cmd.append("--help-all") 212s out, err, rc = get_output_error_code(cmd) 212s > assert rc == 0, err 212s E AssertionError: Traceback (most recent call last): 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s E klass = self._resolve_string(klass) 212s E ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s E return import_item(string) 212s E ^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s E module = __import__(package, fromlist=[obj]) 212s E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s E 212s E During handling of the above exception, another exception occurred: 212s E 212s E Traceback (most recent call last): 212s E File "", line 198, in _run_module_as_main 212s E File "", line 88, in _run_code 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/__main__.py", line 3, in 212s E app.launch_new_instance() 212s E File "/usr/lib/python3/dist-packages/jupyter_core/application.py", line 282, in launch_instance 212s E super().launch_instance(argv=argv, **kwargs) 
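check_help_all_output is a thin subprocess wrapper: run the package as a module with --help-all and assert a zero exit status, which is why the traitlets TypeError surfaces here as an AssertionError carrying the child's stderr. Roughly, using subprocess.run in place of the get_output_error_code helper it calls:

    import subprocess
    import sys

    proc = subprocess.run([sys.executable, "-m", "notebook", "--help-all"],
                          capture_output=True, text=True)
    print(proc.returncode)                  # non-zero on this testbed because startup hits the TypeError
    print(proc.stderr.splitlines()[-3:])    # tail of the child's traceback, if any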
212s E File "/usr/lib/python3/dist-packages/traitlets/config/application.py", line 1073, in launch_instance 212s E app = cls.instance(**kwargs) 212s E ^^^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/config/configurable.py", line 583, in instance 212s E inst = cls(*args, **kwargs) 212s E ^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s E inst.setup_instance(*args, **kwargs) 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s E super(HasTraits, self).setup_instance(*args, **kwargs) 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s E init(self) 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s E self._resolve_classes() 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s E warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s /usr/lib/python3/dist-packages/traitlets/tests/utils.py:38: AssertionError 212s ____________________________ test_server_info_file _____________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_server_info_file(): 212s td = TemporaryDirectory() 212s > nbapp = NotebookApp(runtime_dir=td.name, log=logging.getLogger()) 212s 212s notebook/tests/test_notebookapp.py:32: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _________________________________ test_nb_dir __________________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_nb_dir(): 212s with TemporaryDirectory() as td: 212s > app = NotebookApp(notebook_dir=td) 212s 212s notebook/tests/test_notebookapp.py:49: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s ____________________________ test_no_create_nb_dir _____________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_no_create_nb_dir(): 212s with TemporaryDirectory() as td: 212s nbdir = os.path.join(td, 'notebooks') 212s > app = NotebookApp() 212s 212s notebook/tests/test_notebookapp.py:55: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _____________________________ test_missing_nb_dir ______________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_missing_nb_dir(): 212s with TemporaryDirectory() as td: 212s nbdir = os.path.join(td, 'notebook', 'dir', 'is', 'missing') 212s > app = NotebookApp() 212s 212s notebook/tests/test_notebookapp.py:62: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _____________________________ test_invalid_nb_dir ______________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_invalid_nb_dir(): 212s with NamedTemporaryFile() as tf: 212s > app = NotebookApp() 212s 212s notebook/tests/test_notebookapp.py:68: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s ____________________________ test_nb_dir_with_slash ____________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_nb_dir_with_slash(): 212s with TemporaryDirectory(suffix="_slash" + os.sep) as td: 212s > app = NotebookApp(notebook_dir=td) 212s 212s notebook/tests/test_notebookapp.py:74: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _______________________________ test_nb_dir_root _______________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_nb_dir_root(): 212s root = os.path.abspath(os.sep) # gets the right value on Windows, Posix 212s > app = NotebookApp(notebook_dir=root) 212s 212s notebook/tests/test_notebookapp.py:79: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _____________________________ test_generate_config _____________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_generate_config(): 212s with TemporaryDirectory() as td: 212s > app = NotebookApp(config_dir=td) 212s 212s notebook/tests/test_notebookapp.py:84: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s ____________________________ test_notebook_password ____________________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s def test_notebook_password(): 212s password = 'secret' 212s with TemporaryDirectory() as td: 212s with patch.dict('os.environ', { 212s 'JUPYTER_CONFIG_DIR': td, 212s }), patch.object(getpass, 'getpass', return_value=password): 212s app = notebookapp.NotebookPasswordApp(log_level=logging.ERROR) 212s app.initialize([]) 212s app.start() 212s > nb = NotebookApp() 212s 212s notebook/tests/test_notebookapp.py:133: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _________________ TestInstallServerExtension.test_merge_config _________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 
212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def test_merge_config(self): 212s # enabled at sys level 212s mock_sys = self._inject_mock_extension('mockext_sys') 212s # enabled at sys, disabled at user 212s mock_both = self._inject_mock_extension('mockext_both') 212s # enabled at user 212s mock_user = self._inject_mock_extension('mockext_user') 212s # enabled at Python 212s mock_py = self._inject_mock_extension('mockext_py') 212s 212s toggle_serverextension_python('mockext_sys', enabled=True, user=False) 212s toggle_serverextension_python('mockext_user', enabled=True, user=True) 212s toggle_serverextension_python('mockext_both', enabled=True, user=False) 212s toggle_serverextension_python('mockext_both', enabled=False, user=True) 212s 212s > app = NotebookApp(nbserver_extensions={'mockext_py': True}) 212s 212s notebook/tests/test_serverextensions.py:147: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _________________ TestOrderedServerExtension.test_load_ordered _________________ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s > klass = self._resolve_string(klass) 212s 212s notebook/traittypes.py:235: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:2015: in _resolve_string 212s return import_item(string) 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s name = 'jupyter_server.contents.services.managers.ContentsManager' 212s 212s def import_item(name: str) -> Any: 212s """Import and return ``bar`` given the string ``foo.bar``. 212s 212s Calling ``bar = import_item("foo.bar")`` is the functional equivalent of 212s executing the code ``from foo import bar``. 212s 212s Parameters 212s ---------- 212s name : string 212s The fully qualified name of the module/package being imported. 212s 212s Returns 212s ------- 212s mod : module object 212s The module that was imported. 
212s """ 212s if not isinstance(name, str): 212s raise TypeError("import_item accepts strings, not '%s'." % type(name)) 212s parts = name.rsplit(".", 1) 212s if len(parts) == 2: 212s # called with 'foo.bar....' 212s package, obj = parts 212s > module = __import__(package, fromlist=[obj]) 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s 212s /usr/lib/python3/dist-packages/traitlets/utils/importstring.py:33: ModuleNotFoundError 212s 212s During handling of the above exception, another exception occurred: 212s 212s self = 212s 212s def test_load_ordered(self): 212s > app = NotebookApp() 212s 212s notebook/tests/test_serverextensions.py:189: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1292: in __new__ 212s inst.setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1335: in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s /usr/lib/python3/dist-packages/traitlets/traitlets.py:1311: in setup_instance 212s init(self) 212s notebook/traittypes.py:226: in instance_init 212s self._resolve_classes() 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s self = 212s 212s def _resolve_classes(self): 212s # Resolve all string names to actual classes. 212s self.importable_klasses = [] 212s for klass in self.klasses: 212s if isinstance(klass, str): 212s try: 212s klass = self._resolve_string(klass) 212s self.importable_klasses.append(klass) 212s except: 212s > warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s notebook/traittypes.py:238: TypeError 212s _______________________________ test_help_output _______________________________ 212s 212s def test_help_output(): 212s """jupyter notebook --help-all works""" 212s # FIXME: will be notebook 212s > check_help_all_output('notebook') 212s 212s notebook/tests/test_utils.py:21: 212s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 212s 212s pkg = 'notebook', subcommand = None 212s 212s def check_help_all_output(pkg: str, subcommand: Sequence[str] | None = None) -> tuple[str, str]: 212s """test that `python -m PKG --help-all` works""" 212s cmd = [sys.executable, "-m", pkg] 212s if subcommand: 212s cmd.extend(subcommand) 212s cmd.append("--help-all") 212s out, err, rc = get_output_error_code(cmd) 212s > assert rc == 0, err 212s E AssertionError: Traceback (most recent call last): 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s E klass = self._resolve_string(klass) 212s E ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s E return import_item(string) 212s E ^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s E module = __import__(package, fromlist=[obj]) 212s E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s E ModuleNotFoundError: No module named 'jupyter_server' 212s E 212s E During handling of the above exception, another exception occurred: 212s E 212s E Traceback (most recent call last): 212s E File "", line 198, in _run_module_as_main 212s E File "", line 88, in _run_code 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/__main__.py", line 3, in 212s E app.launch_new_instance() 212s E 
File "/usr/lib/python3/dist-packages/jupyter_core/application.py", line 282, in launch_instance 212s E super().launch_instance(argv=argv, **kwargs) 212s E File "/usr/lib/python3/dist-packages/traitlets/config/application.py", line 1073, in launch_instance 212s E app = cls.instance(**kwargs) 212s E ^^^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/config/configurable.py", line 583, in instance 212s E inst = cls(*args, **kwargs) 212s E ^^^^^^^^^^^^^^^^^^^^ 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s E inst.setup_instance(*args, **kwargs) 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s E super(HasTraits, self).setup_instance(*args, **kwargs) 212s E File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s E init(self) 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s E self._resolve_classes() 212s E File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s E warn(f"{klass} is not importable. Is it installed?", ImportWarning) 212s E TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s /usr/lib/python3/dist-packages/traitlets/tests/utils.py:38: AssertionError 212s =============================== warnings summary =============================== 212s notebook/nbextensions.py:15 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/nbextensions.py:15: DeprecationWarning: Jupyter is migrating its paths to use standard platformdirs 212s given by the platformdirs library. To remove this warning and 212s see the appropriate new directories, set the environment variable 212s `JUPYTER_PLATFORM_DIRS=1` and then run `jupyter --paths`. 212s The use of platformdirs will be the default in `jupyter_core` v6 212s from jupyter_core.paths import ( 212s 212s notebook/utils.py:280 212s notebook/utils.py:280 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/utils.py:280: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead. 212s return LooseVersion(v) >= LooseVersion(check) 212s 212s notebook/_tz.py:29: 1 warning 212s notebook/services/sessions/tests/test_sessionmanager.py: 9 warnings 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/_tz.py:29: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC). 212s dt = unaware(*args, **kwargs) 212s 212s notebook/tests/test_notebookapp_integration.py:14 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/test_notebookapp_integration.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.integration_tests - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html 212s pytestmark = pytest.mark.integration_tests 212s 212s notebook/auth/tests/test_login.py::LoginTest::test_next_bad 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-1 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
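The PytestUnknownMarkWarning above is independent of the failures; it only means the integration_tests mark applied in notebook/tests/test_notebookapp_integration.py is not registered with pytest. One way to register it is a pytest_configure hook in a conftest.py, sketched below (the marker description string is illustrative, not taken from the package):

    # conftest.py
    def pytest_configure(config):
        # Registering the mark silences PytestUnknownMarkWarning.
        config.addinivalue_line(
            "markers",
            "integration_tests: marks tests that exercise a running notebook server",
        )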
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_import_error 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-2 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/api/tests/test_api.py::APITest::test_get_spec 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-3 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/config/tests/test_config_api.py::APITest::test_create_retrieve_config 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-4 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-5 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-6 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/contents/tests/test_largefilemanager.py: 42 warnings 212s notebook/services/contents/tests/test_manager.py: 526 warnings 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/_tz.py:29: DeprecationWarning: datetime.datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.fromtimestamp(timestamp, datetime.UTC). 212s dt = unaware(*args, **kwargs) 212s 212s notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_connections 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-7 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_connections 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-8 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/kernels/tests/test_kernels_api.py::KernelFilterTest::test_config 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-9 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/kernels/tests/test_kernels_api.py::KernelCullingTest::test_culling 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-10 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernel_resource_file 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-11 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/nbconvert/tests/test_nbconvert_api.py::APITest::test_list_formats 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-12 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-13 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-14 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-15 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_config 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-16 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_files.py::FilesTest::test_contents_manager 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-17 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_gateway.py::TestGateway::test_gateway_class_mappings 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-18 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar 212s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar 212s notebook/tests/test_nbextensions.py::TestInstallNBExtension::test_install_tar 212s /tmp/autopkgtest.s4beMp/build.uMz/src/notebook/nbextensions.py:154: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 212s archive.extractall(nbext) 212s 212s notebook/tests/test_notebookapp.py::NotebookAppTests::test_list_running_servers 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-19 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_list_running_sock_servers 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-20 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_log_json_enabled 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-21 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tests/test_paths.py::RedirectTestCase::test_trailing_slash 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-22 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s notebook/tree/tests/test_tree_handler.py::TreeTest::test_redirect 212s /usr/lib/python3/dist-packages/_pytest/threadexception.py:73: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-23 (start_thread) 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 235, in _resolve_classes 212s klass = self._resolve_string(klass) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 2015, in _resolve_string 212s return import_item(string) 212s ^^^^^^^^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/utils/importstring.py", line 33, in import_item 212s module = __import__(package, fromlist=[obj]) 212s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 212s ModuleNotFoundError: No module named 'jupyter_server' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 155, in start_thread 212s app = cls.notebook = NotebookApp( 212s ^^^^^^^^^^^^ 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1292, in __new__ 212s inst.setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1335, in setup_instance 212s super(HasTraits, self).setup_instance(*args, **kwargs) 212s File "/usr/lib/python3/dist-packages/traitlets/traitlets.py", line 1311, in setup_instance 212s init(self) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 226, in instance_init 212s self._resolve_classes() 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/traittypes.py", line 238, in _resolve_classes 212s warn(f"{klass} is not importable. 
Is it installed?", ImportWarning) 212s TypeError: warn() missing 1 required keyword-only argument: 'stacklevel' 212s 212s During handling of the above exception, another exception occurred: 212s 212s Traceback (most recent call last): 212s File "/usr/lib/python3.12/threading.py", line 1073, in _bootstrap_inner 212s self.run() 212s File "/usr/lib/python3.12/threading.py", line 1010, in run 212s self._target(*self._args, **self._kwargs) 212s File "/tmp/autopkgtest.s4beMp/build.uMz/src/notebook/tests/launchnotebook.py", line 193, in start_thread 212s app.session_manager.close() 212s ^^^ 212s UnboundLocalError: cannot access local variable 'app' where it is not associated with a value 212s 212s warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg)) 212s 212s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 212s =========================== short test summary info ============================ 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_delete_session 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_get_session 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_bad_update_session 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_delete_session 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_get_session 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_get_session_dead_kernel 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_list_sessions 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_list_sessions_dead_kernel 212s FAILED notebook/services/sessions/tests/test_sessionmanager.py::TestSessionManager::test_update_session 212s FAILED notebook/tests/test_notebookapp.py::test_help_output - AssertionError:... 212s FAILED notebook/tests/test_notebookapp.py::test_server_info_file - TypeError:... 212s FAILED notebook/tests/test_notebookapp.py::test_nb_dir - TypeError: warn() mi... 212s FAILED notebook/tests/test_notebookapp.py::test_no_create_nb_dir - TypeError:... 212s FAILED notebook/tests/test_notebookapp.py::test_missing_nb_dir - TypeError: w... 212s FAILED notebook/tests/test_notebookapp.py::test_invalid_nb_dir - TypeError: w... 212s FAILED notebook/tests/test_notebookapp.py::test_nb_dir_with_slash - TypeError... 212s FAILED notebook/tests/test_notebookapp.py::test_nb_dir_root - TypeError: warn... 212s FAILED notebook/tests/test_notebookapp.py::test_generate_config - TypeError: ... 212s FAILED notebook/tests/test_notebookapp.py::test_notebook_password - TypeError... 212s FAILED notebook/tests/test_serverextensions.py::TestInstallServerExtension::test_merge_config 212s FAILED notebook/tests/test_serverextensions.py::TestOrderedServerExtension::test_load_ordered 212s FAILED notebook/tests/test_utils.py::test_help_output - AssertionError: Trace... 212s ERROR notebook/auth/tests/test_login.py::LoginTest::test_next_bad - RuntimeEr... 212s ERROR notebook/auth/tests/test_login.py::LoginTest::test_next_ok - RuntimeErr... 
212s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_import_error 212s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_invoke 212s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_bundler_not_enabled 212s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_missing_bundler_arg 212s ERROR notebook/bundler/tests/test_bundler_api.py::BundleAPITest::test_notebook_not_found 212s ERROR notebook/services/api/tests/test_api.py::APITest::test_get_spec - Runti... 212s ERROR notebook/services/api/tests/test_api.py::APITest::test_get_status - Run... 212s ERROR notebook/services/api/tests/test_api.py::APITest::test_no_track_activity 212s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_create_retrieve_config 212s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_get_unknown 212s ERROR notebook/services/config/tests/test_config_api.py::APITest::test_modify 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_checkpoints_separate_root 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_copy 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_dir_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_path 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_put_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_copy_put_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_create_untitled 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_create_untitled_txt 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_delete_hidden_dir 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_delete_hidden_file 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_file_checkpoints 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_404_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_bad_type 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_binary_file_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_contents_no_such_file 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_dir_no_content 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_invalid 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_nb_no_content 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_get_text_file_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_dirs 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_nonexistant_dir 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_list_notebooks 212s ERROR 
notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir_hidden_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_mkdir_untitled 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_rename_existing 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_save 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_b64 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_txt 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_txt_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::APITest::test_upload_v2 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_checkpoints_separate_root 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_config_did_something 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_copy 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_dir_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_path 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_put_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_copy_put_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_create_untitled 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_create_untitled_txt 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_delete_hidden_dir 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_delete_hidden_file 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_file_checkpoints 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_404_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_bad_type 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_binary_file_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_contents_no_such_file 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_dir_no_content 212s ERROR 
notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_invalid 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_nb_no_content 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_get_text_file_contents 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_dirs 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_nonexistant_dir 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_list_notebooks 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir_hidden_400 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_mkdir_untitled 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename_400_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_rename_existing 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_save 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_b64 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_txt 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_txt_hidden 212s ERROR notebook/services/contents/tests/test_contents_api.py::GenericFileCheckpointsAPITest::test_upload_v2 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_connections 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_default_kernel 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_kernel_handler 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_main_kernel_handler 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelAPITest::test_no_kernels 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_connections 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_default_kernel 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_kernel_handler 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_main_kernel_handler 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::AsyncKernelAPITest::test_no_kernels 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelFilterTest::test_config 212s ERROR notebook/services/kernels/tests/test_kernels_api.py::KernelCullingTest::test_culling 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernel_resource_file 212s ERROR 
notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernelspec 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_kernelspec_spaces 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_nonexistant_kernelspec 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_get_nonexistant_resource 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_list_kernelspecs 212s ERROR notebook/services/kernelspecs/tests/test_kernelspecs_api.py::APITest::test_list_kernelspecs_bad 212s ERROR notebook/services/nbconvert/tests/test_nbconvert_api.py::APITest::test_list_formats 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_console_session 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_deprecated 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_file_session 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_create_with_kernel_id 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_delete 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_kernel_id 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_kernel_name 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_path 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_path_deprecated 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::SessionAPITest::test_modify_type 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_console_session 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_deprecated 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_file_session 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_create_with_kernel_id 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_delete 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_kernel_id 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_kernel_name 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_path 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_path_deprecated 212s ERROR notebook/services/sessions/tests/test_sessions_api.py::AsyncSessionAPITest::test_modify_type 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal_via_get 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_create_terminal_with_name 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_no_terminals 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_terminal_handler 
212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalAPITest::test_terminal_root_handler 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_config 212s ERROR notebook/terminal/tests/test_terminals_api.py::TerminalCullingTest::test_culling 212s ERROR notebook/tests/test_files.py::FilesTest::test_contents_manager - Runtim... 212s ERROR notebook/tests/test_files.py::FilesTest::test_download - RuntimeError: ... 212s ERROR notebook/tests/test_files.py::FilesTest::test_hidden_files - RuntimeErr... 212s ERROR notebook/tests/test_files.py::FilesTest::test_old_files_redirect - Runt... 212s ERROR notebook/tests/test_files.py::FilesTest::test_view_html - RuntimeError:... 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_class_mappings 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_get_kernelspecs 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_get_named_kernelspec 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_kernel_lifecycle 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_options - Run... 212s ERROR notebook/tests/test_gateway.py::TestGateway::test_gateway_session_lifecycle 212s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_list_running_servers 212s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_log_json_default 212s ERROR notebook/tests/test_notebookapp.py::NotebookAppTests::test_validate_log_json 212s ERROR notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_list_running_sock_servers 212s ERROR notebook/tests/test_notebookapp.py::NotebookUnixSocketTests::test_run 212s ERROR notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_log_json_enabled 212s ERROR notebook/tests/test_notebookapp.py::NotebookAppJSONLoggingTests::test_validate_log_json 212s ERROR notebook/tests/test_paths.py::RedirectTestCase::test_trailing_slash - R... 212s ERROR notebook/tree/tests/test_tree_handler.py::TreeTest::test_redirect - Run... 212s = 22 failed, 123 passed, 20 skipped, 5 deselected, 608 warnings, 160 errors in 36.46s = 212s autopkgtest [10:27:40]: test pytest: -----------------------] 213s pytest FAIL non-zero exit status 1 213s autopkgtest [10:27:41]: test pytest: - - - - - - - - - - results - - - - - - - - - - 213s autopkgtest [10:27:41]: test command1: preparing testbed 319s autopkgtest [10:29:27]: testbed dpkg architecture: s390x 319s autopkgtest [10:29:27]: testbed apt version: 2.9.5 319s autopkgtest [10:29:27]: @@@@@@@@@@@@@@@@@@@@ test bed setup 320s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB] 320s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB] 320s Get:3 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B] 320s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB] 320s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B] 320s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main s390x Packages [43.9 kB] 320s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/restricted s390x Packages [1860 B] 320s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x Packages [298 kB] 320s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse s390x Packages [2528 B] 321s Fetched 892 kB in 1s (1161 kB/s) 321s Reading package lists... 323s Reading package lists... 323s Building dependency tree... 
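Editor's note on the secondary error seen in every traceback above: the failures surface as UnboundLocalError rather than as the underlying import problem because, in launchnotebook.py's start_thread, the cleanup line app.session_manager.close() runs even though NotebookApp(...) raised before app was ever bound. Below is a self-contained sketch of that failure mode and of a guard that would let the original exception propagate; FakeNotebookApp and the guard are illustrative only, not the harness's actual code or its eventual fix.

```python
class FakeNotebookApp:
    """Stand-in whose constructor fails, like NotebookApp does in the log."""
    def __init__(self):
        raise ModuleNotFoundError("No module named 'jupyter_server'")

def start_thread_unguarded():
    try:
        app = FakeNotebookApp()      # raises before `app` is assigned
    finally:
        app.session_manager.close()  # UnboundLocalError masks the real error

def start_thread_guarded():
    app = None
    try:
        app = FakeNotebookApp()
    finally:
        if app is not None:          # only clean up what was actually created
            app.session_manager.close()

try:
    start_thread_unguarded()
except UnboundLocalError as exc:
    print("unguarded:", exc)

try:
    start_thread_guarded()
except ModuleNotFoundError as exc:
    print("guarded:", exc)           # the original failure now propagates
```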
323s Reading state information... 323s Calculating upgrade... 323s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 323s Reading package lists... 324s Building dependency tree... 324s Reading state information... 324s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 324s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease 324s Get:2 http://ftpmaster.internal/ubuntu oracular InRelease [110 kB] 325s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease 325s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease 325s Get:5 http://ftpmaster.internal/ubuntu oracular/universe Sources [20.1 MB] 326s Get:6 http://ftpmaster.internal/ubuntu oracular/main Sources [1384 kB] 326s Get:7 http://ftpmaster.internal/ubuntu oracular/main s390x Packages [1336 kB] 326s Get:8 http://ftpmaster.internal/ubuntu oracular/universe s390x Packages [14.9 MB] 331s Fetched 37.8 MB in 7s (5394 kB/s) 332s Reading package lists... 332s Reading package lists... 332s Building dependency tree... 332s Reading state information... 333s Calculating upgrade... 333s The following packages will be upgraded: 333s libldap-common libldap2 333s 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 333s Need to get 230 kB of archives. 333s After this operation, 16.4 kB disk space will be freed. 333s Get:1 http://ftpmaster.internal/ubuntu oracular/main s390x libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB] 333s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x libldap2 s390x 2.6.7+dfsg-1~exp1ubuntu9 [199 kB] 334s Fetched 230 kB in 0s (548 kB/s) 334s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54671 files and directories currently installed.) 334s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ... 334s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 334s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_s390x.deb ... 334s Unpacking libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ... 334s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ... 334s Setting up libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) ... 334s Processing triggers for man-db (2.12.1-2) ... 334s Processing triggers for libc-bin (2.39-0ubuntu9) ... 334s Reading package lists... 334s Building dependency tree... 334s Reading state information... 334s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 338s Reading package lists... 338s Building dependency tree... 338s Reading state information... 
338s Starting pkgProblemResolver with broken count: 0 338s Starting 2 pkgProblemResolver with broken count: 0 338s Done 339s The following additional packages will be installed: 339s fonts-font-awesome fonts-glyphicons-halflings fonts-lato fonts-mathjax gdb 339s jupyter-core jupyter-notebook libbabeltrace1 libdebuginfod-common 339s libdebuginfod1t64 libjs-backbone libjs-bootstrap libjs-bootstrap-tour 339s libjs-codemirror libjs-es6-promise libjs-jed libjs-jquery 339s libjs-jquery-typeahead libjs-jquery-ui libjs-marked libjs-mathjax 339s libjs-moment libjs-requirejs libjs-requirejs-text libjs-sphinxdoc 339s libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 libpgm-5.3-0t64 339s libpython3.12t64 libsodium23 libsource-highlight-common 339s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc 339s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach 339s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil 339s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints 339s python3-executing python3-fastjsonschema python3-html5lib python3-ipykernel 339s python3-ipython python3-ipython-genutils python3-jedi python3-jupyter-client 339s python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline 339s python3-mistune python3-nbclient python3-nbconvert python3-nbformat 339s python3-nest-asyncio python3-notebook python3-packaging 339s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 339s python3-prometheus-client python3-prompt-toolkit python3-psutil 339s python3-ptyprocess python3-pure-eval python3-py python3-pydevd 339s python3-send2trash python3-soupsieve python3-stack-data python3-terminado 339s python3-tinycss2 python3-tornado python3-traitlets python3-typeshed 339s python3-wcwidth python3-webencodings python3-zmq sphinx-rtd-theme-common 339s Suggested packages: 339s gdb-doc gdbserver libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs 339s fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc 339s python-bleach-doc python-bytecode-doc python-coverage-doc 339s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc 339s python3-pip python-nbconvert-doc texlive-fonts-recommended 339s texlive-plain-generic texlive-xetex python-pexpect-doc subversion 339s python3-pytest pydevd python-terminado-doc python-tinycss2-doc 339s python3-pycurl python-tornado-doc python3-twisted 339s Recommended packages: 339s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc 339s python3-ipywidgets 339s The following NEW packages will be installed: 339s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings fonts-lato 339s fonts-mathjax gdb jupyter-core jupyter-notebook libbabeltrace1 339s libdebuginfod-common libdebuginfod1t64 libjs-backbone libjs-bootstrap 339s libjs-bootstrap-tour libjs-codemirror libjs-es6-promise libjs-jed 339s libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 339s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 339s libjs-sphinxdoc libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 339s libpgm-5.3-0t64 libpython3.12t64 libsodium23 libsource-highlight-common 339s libsource-highlight4t64 libzmq5 node-jed python-notebook-doc 339s python-tinycss2-common python3-argon2 python3-asttokens python3-bleach 339s python3-bs4 python3-bytecode python3-comm python3-coverage python3-dateutil 339s python3-debugpy python3-decorator python3-defusedxml python3-entrypoints 339s python3-executing 
python3-fastjsonschema python3-html5lib python3-ipykernel 339s python3-ipython python3-ipython-genutils python3-jedi python3-jupyter-client 339s python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline 339s python3-mistune python3-nbclient python3-nbconvert python3-nbformat 339s python3-nest-asyncio python3-notebook python3-packaging 339s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 339s python3-prometheus-client python3-prompt-toolkit python3-psutil 339s python3-ptyprocess python3-pure-eval python3-py python3-pydevd 339s python3-send2trash python3-soupsieve python3-stack-data python3-terminado 339s python3-tinycss2 python3-tornado python3-traitlets python3-typeshed 339s python3-wcwidth python3-webencodings python3-zmq sphinx-rtd-theme-common 339s 0 upgraded, 92 newly installed, 0 to remove and 0 not upgraded. 339s Need to get 33.2 MB/33.2 MB of archives. 339s After this operation, 168 MB of additional disk space will be used. 339s Get:1 /tmp/autopkgtest.s4beMp/2-autopkgtest-satdep.deb autopkgtest-satdep s390x 0 [728 B] 339s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-lato all 2.015-1 [2781 kB] 339s Get:3 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod-common all 0.191-1 [14.6 kB] 339s Get:4 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 339s Get:5 http://ftpmaster.internal/ubuntu oracular/universe s390x fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 339s Get:6 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 340s Get:7 http://ftpmaster.internal/ubuntu oracular/main s390x libbabeltrace1 s390x 1.5.11-3build3 [173 kB] 340s Get:8 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod1t64 s390x 0.191-1 [17.6 kB] 340s Get:9 http://ftpmaster.internal/ubuntu oracular/main s390x libpython3.12t64 s390x 3.12.4-1 [2507 kB] 340s Get:10 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 340s Get:11 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight4t64 s390x 3.1.9-4.3build1 [268 kB] 340s Get:12 http://ftpmaster.internal/ubuntu oracular/main s390x gdb s390x 15.0.50.20240403-0ubuntu1 [3899 kB] 340s Get:13 http://ftpmaster.internal/ubuntu oracular/main s390x python3-platformdirs all 4.2.1-1 [16.3 kB] 340s Get:14 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x python3-traitlets all 5.14.3-1 [71.3 kB] 340s Get:15 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-core all 5.3.2-2 [25.5 kB] 340s Get:16 http://ftpmaster.internal/ubuntu oracular/universe s390x jupyter-core all 5.3.2-2 [4038 B] 340s Get:17 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 340s Get:18 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 340s Get:19 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 340s Get:20 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 340s Get:21 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 340s Get:22 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 340s Get:23 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-es6-promise all 4.2.8-12 
[14.1 kB] 340s Get:24 http://ftpmaster.internal/ubuntu oracular/universe s390x node-jed all 1.1.1-4 [15.2 kB] 340s Get:25 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jed all 1.1.1-4 [2584 B] 340s Get:26 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 340s Get:27 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 340s Get:28 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 340s Get:29 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 340s Get:30 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-moment all 2.29.4+ds-1 [147 kB] 340s Get:31 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 kB] 340s Get:32 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs-text all 2.0.12-1.1 [9056 B] 340s Get:33 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-text-encoding all 0.7.0-5 [140 kB] 340s Get:34 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-xterm all 5.3.0-2 [476 kB] 340s Get:35 http://ftpmaster.internal/ubuntu oracular/main s390x python3-ptyprocess all 0.7.0-5 [15.1 kB] 340s Get:36 http://ftpmaster.internal/ubuntu oracular/main s390x python3-tornado s390x 6.4.1-1 [298 kB] 340s Get:37 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-terminado all 0.18.1-1 [13.2 kB] 340s Get:38 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-argon2 s390x 21.1.0-2build1 [21.2 kB] 340s Get:39 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-comm all 0.2.1-1 [7016 B] 340s Get:40 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bytecode all 0.15.1-3 [44.7 kB] 340s Get:41 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-coverage s390x 7.4.4+dfsg1-0ubuntu2 [147 kB] 340s Get:42 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pydevd s390x 2.10.0+ds-10ubuntu1 [638 kB] 340s Get:43 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 340s Get:44 http://ftpmaster.internal/ubuntu oracular/main s390x python3-decorator all 5.1.1-5 [10.1 kB] 340s Get:45 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-parso all 0.8.3-1 [67.2 kB] 340s Get:46 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 340s Get:47 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jedi all 0.19.1+ds1-1 [693 kB] 340s Get:48 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-matplotlib-inline all 0.1.6-2 [8784 B] 340s Get:49 http://ftpmaster.internal/ubuntu oracular/main s390x python3-pexpect all 4.9-2 [48.1 kB] 340s Get:50 http://ftpmaster.internal/ubuntu oracular/main s390x python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 340s Get:51 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-prompt-toolkit all 3.0.46-1 [256 kB] 340s Get:52 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-asttokens all 2.4.1-1 [20.9 kB] 340s Get:53 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-executing all 2.0.1-0.1 [23.3 kB] 340s Get:54 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pure-eval all 0.2.2-2 [11.1 kB] 340s Get:55 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-stack-data all 0.6.3-1 [22.0 kB] 340s 
Get:56 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython all 8.20.0-1ubuntu1 [561 kB] 340s Get:57 http://ftpmaster.internal/ubuntu oracular/main s390x python3-dateutil all 2.9.0-2 [80.3 kB] 340s Get:58 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-entrypoints all 0.4-2 [7146 B] 340s Get:59 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nest-asyncio all 1.5.4-1 [6256 B] 340s Get:60 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-py all 1.11.0-2 [72.7 kB] 340s Get:61 http://ftpmaster.internal/ubuntu oracular/universe s390x libnorm1t64 s390x 1.5.9+dfsg-3.1build1 [158 kB] 340s Get:62 http://ftpmaster.internal/ubuntu oracular/universe s390x libpgm-5.3-0t64 s390x 5.3.128~dfsg-2.1build1 [169 kB] 340s Get:63 http://ftpmaster.internal/ubuntu oracular/main s390x libsodium23 s390x 1.0.18-1build3 [138 kB] 340s Get:64 http://ftpmaster.internal/ubuntu oracular/universe s390x libzmq5 s390x 4.3.5-1build2 [258 kB] 340s Get:65 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-zmq s390x 24.0.1-5build1 [298 kB] 340s Get:66 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 340s Get:67 http://ftpmaster.internal/ubuntu oracular/main s390x python3-packaging all 24.0-1 [41.1 kB] 340s Get:68 http://ftpmaster.internal/ubuntu oracular/main s390x python3-psutil s390x 5.9.8-2build2 [195 kB] 341s Get:69 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 341s Get:70 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython-genutils all 0.2.0-6 [22.0 kB] 341s Get:71 http://ftpmaster.internal/ubuntu oracular/main s390x python3-webencodings all 0.5.1-5 [11.5 kB] 341s Get:72 http://ftpmaster.internal/ubuntu oracular/main s390x python3-html5lib all 1.1-6 [88.8 kB] 341s Get:73 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bleach all 6.1.0-2 [49.6 kB] 341s Get:74 http://ftpmaster.internal/ubuntu oracular/main s390x python3-soupsieve all 2.5-1 [33.0 kB] 341s Get:75 http://ftpmaster.internal/ubuntu oracular/main s390x python3-bs4 all 4.12.3-1 [109 kB] 341s Get:76 http://ftpmaster.internal/ubuntu oracular/main s390x python3-defusedxml all 0.7.1-2 [42.0 kB] 341s Get:77 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 341s Get:78 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-mistune all 3.0.2-1 [32.8 kB] 341s Get:79 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-fastjsonschema all 2.19.1-1 [19.7 kB] 341s Get:80 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbformat all 5.9.1-1 [41.2 kB] 341s Get:81 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbclient all 0.8.0-1 [55.6 kB] 341s Get:82 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pandocfilters all 1.5.1-1 [23.6 kB] 341s Get:83 http://ftpmaster.internal/ubuntu oracular/universe s390x python-tinycss2-common all 1.3.0-1 [34.1 kB] 341s Get:84 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-tinycss2 all 1.3.0-1 [19.6 kB] 341s Get:85 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbconvert all 7.16.4-1 [156 kB] 341s Get:86 http://ftpmaster.internal/ubuntu oracular/main s390x python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 341s Get:87 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-send2trash all 1.8.2-1 [15.5 kB] 341s Get:88 
http://ftpmaster.internal/ubuntu oracular/universe s390x python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 341s Get:89 http://ftpmaster.internal/ubuntu oracular/universe s390x jupyter-notebook all 6.4.12-2.2ubuntu1 [10.4 kB] 341s Get:90 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-sphinxdoc all 7.2.6-8 [150 kB] 341s Get:91 http://ftpmaster.internal/ubuntu oracular/main s390x sphinx-rtd-theme-common all 2.0.0+dfsg-1 [1012 kB] 341s Get:92 http://ftpmaster.internal/ubuntu oracular/universe s390x python-notebook-doc all 6.4.12-2.2ubuntu1 [2540 kB] 341s Preconfiguring packages ... 341s Fetched 33.2 MB in 2s (15.7 MB/s) 341s Selecting previously unselected package fonts-lato. 341s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54671 files and directories currently installed.) 341s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 341s Unpacking fonts-lato (2.015-1) ... 342s Selecting previously unselected package libdebuginfod-common. 342s Preparing to unpack .../01-libdebuginfod-common_0.191-1_all.deb ... 342s Unpacking libdebuginfod-common (0.191-1) ... 342s Selecting previously unselected package fonts-font-awesome. 342s Preparing to unpack .../02-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 342s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 342s Selecting previously unselected package fonts-glyphicons-halflings. 342s Preparing to unpack .../03-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 342s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 342s Selecting previously unselected package fonts-mathjax. 342s Preparing to unpack .../04-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 342s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 342s Selecting previously unselected package libbabeltrace1:s390x. 342s Preparing to unpack .../05-libbabeltrace1_1.5.11-3build3_s390x.deb ... 342s Unpacking libbabeltrace1:s390x (1.5.11-3build3) ... 342s Selecting previously unselected package libdebuginfod1t64:s390x. 342s Preparing to unpack .../06-libdebuginfod1t64_0.191-1_s390x.deb ... 342s Unpacking libdebuginfod1t64:s390x (0.191-1) ... 342s Selecting previously unselected package libpython3.12t64:s390x. 342s Preparing to unpack .../07-libpython3.12t64_3.12.4-1_s390x.deb ... 342s Unpacking libpython3.12t64:s390x (3.12.4-1) ... 342s Selecting previously unselected package libsource-highlight-common. 342s Preparing to unpack .../08-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 342s Unpacking libsource-highlight-common (3.1.9-4.3build1) ... 342s Selecting previously unselected package libsource-highlight4t64:s390x. 342s Preparing to unpack .../09-libsource-highlight4t64_3.1.9-4.3build1_s390x.deb ... 342s Unpacking libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 342s Selecting previously unselected package gdb. 342s Preparing to unpack .../10-gdb_15.0.50.20240403-0ubuntu1_s390x.deb ... 342s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 342s Selecting previously unselected package python3-platformdirs. 
342s Preparing to unpack .../11-python3-platformdirs_4.2.1-1_all.deb ... 342s Unpacking python3-platformdirs (4.2.1-1) ... 342s Selecting previously unselected package python3-traitlets. 342s Preparing to unpack .../12-python3-traitlets_5.14.3-1_all.deb ... 342s Unpacking python3-traitlets (5.14.3-1) ... 342s Selecting previously unselected package python3-jupyter-core. 342s Preparing to unpack .../13-python3-jupyter-core_5.3.2-2_all.deb ... 342s Unpacking python3-jupyter-core (5.3.2-2) ... 342s Selecting previously unselected package jupyter-core. 342s Preparing to unpack .../14-jupyter-core_5.3.2-2_all.deb ... 342s Unpacking jupyter-core (5.3.2-2) ... 342s Selecting previously unselected package libjs-underscore. 342s Preparing to unpack .../15-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 342s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 342s Selecting previously unselected package libjs-backbone. 342s Preparing to unpack .../16-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 342s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 343s Selecting previously unselected package libjs-bootstrap. 343s Preparing to unpack .../17-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 343s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 343s Selecting previously unselected package libjs-jquery. 343s Preparing to unpack .../18-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 343s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 343s Selecting previously unselected package libjs-bootstrap-tour. 343s Preparing to unpack .../19-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 343s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 343s Selecting previously unselected package libjs-codemirror. 343s Preparing to unpack .../20-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ... 343s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ... 343s Selecting previously unselected package libjs-es6-promise. 343s Preparing to unpack .../21-libjs-es6-promise_4.2.8-12_all.deb ... 343s Unpacking libjs-es6-promise (4.2.8-12) ... 343s Selecting previously unselected package node-jed. 343s Preparing to unpack .../22-node-jed_1.1.1-4_all.deb ... 343s Unpacking node-jed (1.1.1-4) ... 343s Selecting previously unselected package libjs-jed. 343s Preparing to unpack .../23-libjs-jed_1.1.1-4_all.deb ... 343s Unpacking libjs-jed (1.1.1-4) ... 343s Selecting previously unselected package libjs-jquery-typeahead. 343s Preparing to unpack .../24-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 343s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 343s Selecting previously unselected package libjs-jquery-ui. 343s Preparing to unpack .../25-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 343s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 343s Selecting previously unselected package libjs-marked. 343s Preparing to unpack .../26-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ... 343s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ... 343s Selecting previously unselected package libjs-mathjax. 343s Preparing to unpack .../27-libjs-mathjax_2.7.9+dfsg-1_all.deb ... 343s Unpacking libjs-mathjax (2.7.9+dfsg-1) ... 344s Selecting previously unselected package libjs-moment. 344s Preparing to unpack .../28-libjs-moment_2.29.4+ds-1_all.deb ... 344s Unpacking libjs-moment (2.29.4+ds-1) ... 344s Selecting previously unselected package libjs-requirejs. 344s Preparing to unpack .../29-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ... 344s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 344s Selecting previously unselected package libjs-requirejs-text. 
344s Preparing to unpack .../30-libjs-requirejs-text_2.0.12-1.1_all.deb ... 344s Unpacking libjs-requirejs-text (2.0.12-1.1) ... 344s Selecting previously unselected package libjs-text-encoding. 344s Preparing to unpack .../31-libjs-text-encoding_0.7.0-5_all.deb ... 344s Unpacking libjs-text-encoding (0.7.0-5) ... 344s Selecting previously unselected package libjs-xterm. 344s Preparing to unpack .../32-libjs-xterm_5.3.0-2_all.deb ... 344s Unpacking libjs-xterm (5.3.0-2) ... 344s Selecting previously unselected package python3-ptyprocess. 344s Preparing to unpack .../33-python3-ptyprocess_0.7.0-5_all.deb ... 344s Unpacking python3-ptyprocess (0.7.0-5) ... 344s Selecting previously unselected package python3-tornado. 344s Preparing to unpack .../34-python3-tornado_6.4.1-1_s390x.deb ... 344s Unpacking python3-tornado (6.4.1-1) ... 345s Selecting previously unselected package python3-terminado. 345s Preparing to unpack .../35-python3-terminado_0.18.1-1_all.deb ... 345s Unpacking python3-terminado (0.18.1-1) ... 345s Selecting previously unselected package python3-argon2. 345s Preparing to unpack .../36-python3-argon2_21.1.0-2build1_s390x.deb ... 345s Unpacking python3-argon2 (21.1.0-2build1) ... 345s Selecting previously unselected package python3-comm. 345s Preparing to unpack .../37-python3-comm_0.2.1-1_all.deb ... 345s Unpacking python3-comm (0.2.1-1) ... 345s Selecting previously unselected package python3-bytecode. 345s Preparing to unpack .../38-python3-bytecode_0.15.1-3_all.deb ... 345s Unpacking python3-bytecode (0.15.1-3) ... 345s Selecting previously unselected package python3-coverage. 345s Preparing to unpack .../39-python3-coverage_7.4.4+dfsg1-0ubuntu2_s390x.deb ... 345s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 345s Selecting previously unselected package python3-pydevd. 345s Preparing to unpack .../40-python3-pydevd_2.10.0+ds-10ubuntu1_s390x.deb ... 345s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ... 345s Selecting previously unselected package python3-debugpy. 345s Preparing to unpack .../41-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ... 345s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ... 345s Selecting previously unselected package python3-decorator. 345s Preparing to unpack .../42-python3-decorator_5.1.1-5_all.deb ... 345s Unpacking python3-decorator (5.1.1-5) ... 345s Selecting previously unselected package python3-parso. 345s Preparing to unpack .../43-python3-parso_0.8.3-1_all.deb ... 345s Unpacking python3-parso (0.8.3-1) ... 345s Selecting previously unselected package python3-typeshed. 345s Preparing to unpack .../44-python3-typeshed_0.0~git20231111.6764465-3_all.deb ... 345s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ... 345s Selecting previously unselected package python3-jedi. 345s Preparing to unpack .../45-python3-jedi_0.19.1+ds1-1_all.deb ... 345s Unpacking python3-jedi (0.19.1+ds1-1) ... 346s Selecting previously unselected package python3-matplotlib-inline. 346s Preparing to unpack .../46-python3-matplotlib-inline_0.1.6-2_all.deb ... 346s Unpacking python3-matplotlib-inline (0.1.6-2) ... 346s Selecting previously unselected package python3-pexpect. 346s Preparing to unpack .../47-python3-pexpect_4.9-2_all.deb ... 346s Unpacking python3-pexpect (4.9-2) ... 346s Selecting previously unselected package python3-wcwidth. 346s Preparing to unpack .../48-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ... 346s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 346s Selecting previously unselected package python3-prompt-toolkit. 
346s Preparing to unpack .../49-python3-prompt-toolkit_3.0.46-1_all.deb ... 346s Unpacking python3-prompt-toolkit (3.0.46-1) ... 346s Selecting previously unselected package python3-asttokens. 346s Preparing to unpack .../50-python3-asttokens_2.4.1-1_all.deb ... 346s Unpacking python3-asttokens (2.4.1-1) ... 346s Selecting previously unselected package python3-executing. 346s Preparing to unpack .../51-python3-executing_2.0.1-0.1_all.deb ... 346s Unpacking python3-executing (2.0.1-0.1) ... 346s Selecting previously unselected package python3-pure-eval. 346s Preparing to unpack .../52-python3-pure-eval_0.2.2-2_all.deb ... 346s Unpacking python3-pure-eval (0.2.2-2) ... 346s Selecting previously unselected package python3-stack-data. 346s Preparing to unpack .../53-python3-stack-data_0.6.3-1_all.deb ... 346s Unpacking python3-stack-data (0.6.3-1) ... 346s Selecting previously unselected package python3-ipython. 346s Preparing to unpack .../54-python3-ipython_8.20.0-1ubuntu1_all.deb ... 346s Unpacking python3-ipython (8.20.0-1ubuntu1) ... 346s Selecting previously unselected package python3-dateutil. 346s Preparing to unpack .../55-python3-dateutil_2.9.0-2_all.deb ... 346s Unpacking python3-dateutil (2.9.0-2) ... 346s Selecting previously unselected package python3-entrypoints. 346s Preparing to unpack .../56-python3-entrypoints_0.4-2_all.deb ... 346s Unpacking python3-entrypoints (0.4-2) ... 346s Selecting previously unselected package python3-nest-asyncio. 346s Preparing to unpack .../57-python3-nest-asyncio_1.5.4-1_all.deb ... 346s Unpacking python3-nest-asyncio (1.5.4-1) ... 346s Selecting previously unselected package python3-py. 346s Preparing to unpack .../58-python3-py_1.11.0-2_all.deb ... 346s Unpacking python3-py (1.11.0-2) ... 346s Selecting previously unselected package libnorm1t64:s390x. 346s Preparing to unpack .../59-libnorm1t64_1.5.9+dfsg-3.1build1_s390x.deb ... 346s Unpacking libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 346s Selecting previously unselected package libpgm-5.3-0t64:s390x. 346s Preparing to unpack .../60-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_s390x.deb ... 346s Unpacking libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 346s Selecting previously unselected package libsodium23:s390x. 346s Preparing to unpack .../61-libsodium23_1.0.18-1build3_s390x.deb ... 346s Unpacking libsodium23:s390x (1.0.18-1build3) ... 346s Selecting previously unselected package libzmq5:s390x. 346s Preparing to unpack .../62-libzmq5_4.3.5-1build2_s390x.deb ... 346s Unpacking libzmq5:s390x (4.3.5-1build2) ... 346s Selecting previously unselected package python3-zmq. 346s Preparing to unpack .../63-python3-zmq_24.0.1-5build1_s390x.deb ... 346s Unpacking python3-zmq (24.0.1-5build1) ... 346s Selecting previously unselected package python3-jupyter-client. 346s Preparing to unpack .../64-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ... 346s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ... 346s Selecting previously unselected package python3-packaging. 346s Preparing to unpack .../65-python3-packaging_24.0-1_all.deb ... 346s Unpacking python3-packaging (24.0-1) ... 346s Selecting previously unselected package python3-psutil. 346s Preparing to unpack .../66-python3-psutil_5.9.8-2build2_s390x.deb ... 346s Unpacking python3-psutil (5.9.8-2build2) ... 346s Selecting previously unselected package python3-ipykernel. 346s Preparing to unpack .../67-python3-ipykernel_6.29.3-1ubuntu1_all.deb ... 346s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ... 
346s Selecting previously unselected package python3-ipython-genutils. 346s Preparing to unpack .../68-python3-ipython-genutils_0.2.0-6_all.deb ... 346s Unpacking python3-ipython-genutils (0.2.0-6) ... 346s Selecting previously unselected package python3-webencodings. 346s Preparing to unpack .../69-python3-webencodings_0.5.1-5_all.deb ... 346s Unpacking python3-webencodings (0.5.1-5) ... 346s Selecting previously unselected package python3-html5lib. 346s Preparing to unpack .../70-python3-html5lib_1.1-6_all.deb ... 346s Unpacking python3-html5lib (1.1-6) ... 346s Selecting previously unselected package python3-bleach. 346s Preparing to unpack .../71-python3-bleach_6.1.0-2_all.deb ... 346s Unpacking python3-bleach (6.1.0-2) ... 346s Selecting previously unselected package python3-soupsieve. 346s Preparing to unpack .../72-python3-soupsieve_2.5-1_all.deb ... 346s Unpacking python3-soupsieve (2.5-1) ... 346s Selecting previously unselected package python3-bs4. 346s Preparing to unpack .../73-python3-bs4_4.12.3-1_all.deb ... 346s Unpacking python3-bs4 (4.12.3-1) ... 346s Selecting previously unselected package python3-defusedxml. 346s Preparing to unpack .../74-python3-defusedxml_0.7.1-2_all.deb ... 346s Unpacking python3-defusedxml (0.7.1-2) ... 346s Selecting previously unselected package python3-jupyterlab-pygments. 346s Preparing to unpack .../75-python3-jupyterlab-pygments_0.2.2-3_all.deb ... 346s Unpacking python3-jupyterlab-pygments (0.2.2-3) ... 346s Selecting previously unselected package python3-mistune. 346s Preparing to unpack .../76-python3-mistune_3.0.2-1_all.deb ... 346s Unpacking python3-mistune (3.0.2-1) ... 346s Selecting previously unselected package python3-fastjsonschema. 346s Preparing to unpack .../77-python3-fastjsonschema_2.19.1-1_all.deb ... 346s Unpacking python3-fastjsonschema (2.19.1-1) ... 346s Selecting previously unselected package python3-nbformat. 346s Preparing to unpack .../78-python3-nbformat_5.9.1-1_all.deb ... 346s Unpacking python3-nbformat (5.9.1-1) ... 346s Selecting previously unselected package python3-nbclient. 346s Preparing to unpack .../79-python3-nbclient_0.8.0-1_all.deb ... 346s Unpacking python3-nbclient (0.8.0-1) ... 346s Selecting previously unselected package python3-pandocfilters. 346s Preparing to unpack .../80-python3-pandocfilters_1.5.1-1_all.deb ... 346s Unpacking python3-pandocfilters (1.5.1-1) ... 346s Selecting previously unselected package python-tinycss2-common. 346s Preparing to unpack .../81-python-tinycss2-common_1.3.0-1_all.deb ... 346s Unpacking python-tinycss2-common (1.3.0-1) ... 346s Selecting previously unselected package python3-tinycss2. 346s Preparing to unpack .../82-python3-tinycss2_1.3.0-1_all.deb ... 346s Unpacking python3-tinycss2 (1.3.0-1) ... 347s Selecting previously unselected package python3-nbconvert. 347s Preparing to unpack .../83-python3-nbconvert_7.16.4-1_all.deb ... 347s Unpacking python3-nbconvert (7.16.4-1) ... 347s Selecting previously unselected package python3-prometheus-client. 347s Preparing to unpack .../84-python3-prometheus-client_0.19.0+ds1-1_all.deb ... 347s Unpacking python3-prometheus-client (0.19.0+ds1-1) ... 347s Selecting previously unselected package python3-send2trash. 347s Preparing to unpack .../85-python3-send2trash_1.8.2-1_all.deb ... 347s Unpacking python3-send2trash (1.8.2-1) ... 347s Selecting previously unselected package python3-notebook. 347s Preparing to unpack .../86-python3-notebook_6.4.12-2.2ubuntu1_all.deb ... 347s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ... 
347s Selecting previously unselected package jupyter-notebook. 347s Preparing to unpack .../87-jupyter-notebook_6.4.12-2.2ubuntu1_all.deb ... 347s Unpacking jupyter-notebook (6.4.12-2.2ubuntu1) ... 347s Selecting previously unselected package libjs-sphinxdoc. 347s Preparing to unpack .../88-libjs-sphinxdoc_7.2.6-8_all.deb ... 347s Unpacking libjs-sphinxdoc (7.2.6-8) ... 347s Selecting previously unselected package sphinx-rtd-theme-common. 347s Preparing to unpack .../89-sphinx-rtd-theme-common_2.0.0+dfsg-1_all.deb ... 347s Unpacking sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 347s Selecting previously unselected package python-notebook-doc. 347s Preparing to unpack .../90-python-notebook-doc_6.4.12-2.2ubuntu1_all.deb ... 347s Unpacking python-notebook-doc (6.4.12-2.2ubuntu1) ... 347s Selecting previously unselected package autopkgtest-satdep. 347s Preparing to unpack .../91-2-autopkgtest-satdep.deb ... 347s Unpacking autopkgtest-satdep (0) ... 347s Setting up python3-entrypoints (0.4-2) ... 347s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 347s Setting up python3-tornado (6.4.1-1) ... 348s Setting up libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 348s Setting up python3-pure-eval (0.2.2-2) ... 348s Setting up python3-send2trash (1.8.2-1) ... 348s Setting up fonts-lato (2.015-1) ... 348s Setting up fonts-mathjax (2.7.9+dfsg-1) ... 348s Setting up libsodium23:s390x (1.0.18-1build3) ... 348s Setting up libjs-mathjax (2.7.9+dfsg-1) ... 348s Setting up python3-py (1.11.0-2) ... 348s Setting up libdebuginfod-common (0.191-1) ... 348s Setting up libjs-requirejs-text (2.0.12-1.1) ... 348s Setting up python3-parso (0.8.3-1) ... 348s Setting up python3-defusedxml (0.7.1-2) ... 349s Setting up python3-ipython-genutils (0.2.0-6) ... 349s Setting up python3-asttokens (2.4.1-1) ... 349s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 349s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 349s Setting up libjs-moment (2.29.4+ds-1) ... 349s Setting up python3-pandocfilters (1.5.1-1) ... 349s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 349s Setting up libjs-es6-promise (4.2.8-12) ... 349s Setting up libjs-text-encoding (0.7.0-5) ... 349s Setting up python3-webencodings (0.5.1-5) ... 349s Setting up python3-platformdirs (4.2.1-1) ... 350s Setting up python3-psutil (5.9.8-2build2) ... 350s Setting up libsource-highlight-common (3.1.9-4.3build1) ... 350s Setting up python3-jupyterlab-pygments (0.2.2-3) ... 350s Setting up libpython3.12t64:s390x (3.12.4-1) ... 350s Setting up libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 350s Setting up python3-decorator (5.1.1-5) ... 350s Setting up python3-packaging (24.0-1) ... 350s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 351s Setting up node-jed (1.1.1-4) ... 351s Setting up python3-typeshed (0.0~git20231111.6764465-3) ... 351s Setting up python3-executing (2.0.1-0.1) ... 351s Setting up libjs-xterm (5.3.0-2) ... 351s Setting up python3-nest-asyncio (1.5.4-1) ... 351s Setting up python3-bytecode (0.15.1-3) ... 351s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ... 351s Setting up libjs-jed (1.1.1-4) ... 351s Setting up python3-html5lib (1.1-6) ... 351s Setting up libbabeltrace1:s390x (1.5.11-3build3) ... 351s Setting up python3-fastjsonschema (2.19.1-1) ... 352s Setting up python3-traitlets (5.14.3-1) ... 352s Setting up python-tinycss2-common (1.3.0-1) ... 352s Setting up python3-argon2 (21.1.0-2build1) ... 352s Setting up python3-dateutil (2.9.0-2) ... 352s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 
352s Setting up python3-mistune (3.0.2-1) ... 353s Setting up python3-stack-data (0.6.3-1) ... 353s Setting up python3-soupsieve (2.5-1) ... 353s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 353s Setting up sphinx-rtd-theme-common (2.0.0+dfsg-1) ... 353s Setting up python3-jupyter-core (5.3.2-2) ... 353s Setting up libjs-bootstrap (3.4.1+dfsg-3) ... 353s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 353s Setting up python3-ptyprocess (0.7.0-5) ... 353s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ... 353s Setting up python3-prompt-toolkit (3.0.46-1) ... 354s Setting up libdebuginfod1t64:s390x (0.191-1) ... 354s Setting up python3-tinycss2 (1.3.0-1) ... 354s Setting up libzmq5:s390x (4.3.5-1build2) ... 354s Setting up python3-jedi (0.19.1+ds1-1) ... 354s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ... 354s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 354s Setting up libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 354s Setting up python3-nbformat (5.9.1-1) ... 355s Setting up python3-bs4 (4.12.3-1) ... 355s Setting up python3-bleach (6.1.0-2) ... 355s Setting up python3-matplotlib-inline (0.1.6-2) ... 355s Setting up python3-comm (0.2.1-1) ... 355s Setting up python3-prometheus-client (0.19.0+ds1-1) ... 355s Setting up gdb (15.0.50.20240403-0ubuntu1) ... 355s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ... 355s Setting up python3-pexpect (4.9-2) ... 356s Setting up python3-zmq (24.0.1-5build1) ... 356s Setting up libjs-sphinxdoc (7.2.6-8) ... 356s Setting up python3-terminado (0.18.1-1) ... 356s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ... 356s Setting up jupyter-core (5.3.2-2) ... 356s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ... 357s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ... 357s Setting up python-notebook-doc (6.4.12-2.2ubuntu1) ... 357s Setting up python3-nbclient (0.8.0-1) ... 357s Setting up python3-ipython (8.20.0-1ubuntu1) ... 358s Setting up python3-ipykernel (6.29.3-1ubuntu1) ... 358s Setting up python3-nbconvert (7.16.4-1) ... 359s Setting up python3-notebook (6.4.12-2.2ubuntu1) ... 359s Setting up jupyter-notebook (6.4.12-2.2ubuntu1) ... 359s Setting up autopkgtest-satdep (0) ... 359s Processing triggers for man-db (2.12.1-2) ... 360s Processing triggers for libc-bin (2.39-0ubuntu9) ... 364s (Reading database ... 71132 files and directories currently installed.) 364s Removing autopkgtest-satdep (0) ... 
366s autopkgtest [10:30:14]: test command1: find /usr/lib/python3/dist-packages/notebook -xtype l >&2
366s autopkgtest [10:30:14]: test command1: [-----------------------
366s autopkgtest [10:30:14]: test command1: -----------------------]
367s command1 PASS (superficial)
367s autopkgtest [10:30:15]: test command1: - - - - - - - - - - results - - - - - - - - - -
367s autopkgtest [10:30:15]: test autodep8-python3: preparing testbed
536s autopkgtest [10:33:04]: testbed dpkg architecture: s390x
536s autopkgtest [10:33:04]: testbed apt version: 2.9.5
536s autopkgtest [10:33:04]: @@@@@@@@@@@@@@@@@@@@ test bed setup
537s Get:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease [110 kB]
538s Get:2 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse Sources [2576 B]
538s Get:3 http://ftpmaster.internal/ubuntu oracular-proposed/universe Sources [389 kB]
538s Get:4 http://ftpmaster.internal/ubuntu oracular-proposed/restricted Sources [7052 B]
538s Get:5 http://ftpmaster.internal/ubuntu oracular-proposed/main Sources [36.1 kB]
538s Get:6 http://ftpmaster.internal/ubuntu oracular-proposed/main s390x Packages [43.9 kB]
538s Get:7 http://ftpmaster.internal/ubuntu oracular-proposed/restricted s390x Packages [1860 B]
538s Get:8 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x Packages [298 kB]
538s Get:9 http://ftpmaster.internal/ubuntu oracular-proposed/multiverse s390x Packages [2528 B]
538s Fetched 892 kB in 1s (1154 kB/s)
538s Reading package lists...
540s Reading package lists...
540s Building dependency tree...
540s Reading state information...
540s Calculating upgrade...
540s The following packages will be upgraded:
540s   libldap-common libldap2
541s 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
541s Need to get 230 kB of archives.
541s After this operation, 16.4 kB disk space will be freed.
541s Get:1 http://ftpmaster.internal/ubuntu oracular/main s390x libldap-common all 2.6.7+dfsg-1~exp1ubuntu9 [31.5 kB]
541s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x libldap2 s390x 2.6.7+dfsg-1~exp1ubuntu9 [199 kB]
541s Fetched 230 kB in 0s (562 kB/s)
541s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54671 files and directories currently installed.)
541s Preparing to unpack .../libldap-common_2.6.7+dfsg-1~exp1ubuntu9_all.deb ...
541s Unpacking libldap-common (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
541s Preparing to unpack .../libldap2_2.6.7+dfsg-1~exp1ubuntu9_s390x.deb ...
541s Unpacking libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) over (2.6.7+dfsg-1~exp1ubuntu8) ...
541s Setting up libldap-common (2.6.7+dfsg-1~exp1ubuntu9) ...
541s Setting up libldap2:s390x (2.6.7+dfsg-1~exp1ubuntu9) ...
541s Processing triggers for man-db (2.12.1-2) ...
541s Processing triggers for libc-bin (2.39-0ubuntu9) ...
542s Reading package lists...
542s Building dependency tree...
542s Reading state information...
542s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
542s Hit:1 http://ftpmaster.internal/ubuntu oracular-proposed InRelease 543s Hit:2 http://ftpmaster.internal/ubuntu oracular InRelease 543s Hit:3 http://ftpmaster.internal/ubuntu oracular-updates InRelease 543s Hit:4 http://ftpmaster.internal/ubuntu oracular-security InRelease 543s Reading package lists... 544s Reading package lists... 544s Building dependency tree... 544s Reading state information... 544s Calculating upgrade... 544s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 544s Reading package lists... 544s Building dependency tree... 544s Reading state information... 544s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 548s Reading package lists... 548s Building dependency tree... 548s Reading state information... 548s Starting pkgProblemResolver with broken count: 0 548s Starting 2 pkgProblemResolver with broken count: 0 548s Done 548s The following additional packages will be installed: 548s fonts-font-awesome fonts-glyphicons-halflings fonts-mathjax gdb 548s libbabeltrace1 libdebuginfod-common libdebuginfod1t64 libjs-backbone 548s libjs-bootstrap libjs-bootstrap-tour libjs-codemirror libjs-es6-promise 548s libjs-jed libjs-jquery libjs-jquery-typeahead libjs-jquery-ui libjs-marked 548s libjs-mathjax libjs-moment libjs-requirejs libjs-requirejs-text 548s libjs-text-encoding libjs-underscore libjs-xterm libnorm1t64 libpgm-5.3-0t64 548s libpython3.12t64 libsodium23 libsource-highlight-common 548s libsource-highlight4t64 libzmq5 node-jed python-tinycss2-common python3-all 548s python3-argon2 python3-asttokens python3-bleach python3-bs4 python3-bytecode 548s python3-comm python3-coverage python3-dateutil python3-debugpy 548s python3-decorator python3-defusedxml python3-entrypoints python3-executing 548s python3-fastjsonschema python3-html5lib python3-ipykernel python3-ipython 548s python3-ipython-genutils python3-jedi python3-jupyter-client 548s python3-jupyter-core python3-jupyterlab-pygments python3-matplotlib-inline 548s python3-mistune python3-nbclient python3-nbconvert python3-nbformat 548s python3-nest-asyncio python3-notebook python3-packaging 548s python3-pandocfilters python3-parso python3-pexpect python3-platformdirs 548s python3-prometheus-client python3-prompt-toolkit python3-psutil 548s python3-ptyprocess python3-pure-eval python3-py python3-pydevd 548s python3-send2trash python3-soupsieve python3-stack-data python3-terminado 548s python3-tinycss2 python3-tornado python3-traitlets python3-typeshed 548s python3-wcwidth python3-webencodings python3-zmq 548s Suggested packages: 548s gdb-doc gdbserver libjs-jquery-lazyload libjs-json libjs-jquery-ui-docs 548s fonts-mathjax-extras fonts-stix libjs-mathjax-doc python-argon2-doc 548s python-bleach-doc python-bytecode-doc python-coverage-doc 548s python-fastjsonschema-doc python3-genshi python3-lxml python-ipython-doc 548s python3-pip python-nbconvert-doc texlive-fonts-recommended 548s texlive-plain-generic texlive-xetex python-notebook-doc python-pexpect-doc 548s subversion python3-pytest pydevd python-terminado-doc python-tinycss2-doc 548s python3-pycurl python-tornado-doc python3-twisted 548s Recommended packages: 548s libc-dbg javascript-common python3-lxml python3-matplotlib pandoc 548s python3-ipywidgets 548s The following NEW packages will be installed: 548s autopkgtest-satdep fonts-font-awesome fonts-glyphicons-halflings 548s fonts-mathjax gdb libbabeltrace1 libdebuginfod-common libdebuginfod1t64 548s libjs-backbone libjs-bootstrap libjs-bootstrap-tour libjs-codemirror 548s libjs-es6-promise 
libjs-jed libjs-jquery libjs-jquery-typeahead 548s libjs-jquery-ui libjs-marked libjs-mathjax libjs-moment libjs-requirejs 548s libjs-requirejs-text libjs-text-encoding libjs-underscore libjs-xterm 548s libnorm1t64 libpgm-5.3-0t64 libpython3.12t64 libsodium23 548s libsource-highlight-common libsource-highlight4t64 libzmq5 node-jed 548s python-tinycss2-common python3-all python3-argon2 python3-asttokens 548s python3-bleach python3-bs4 python3-bytecode python3-comm python3-coverage 548s python3-dateutil python3-debugpy python3-decorator python3-defusedxml 548s python3-entrypoints python3-executing python3-fastjsonschema 548s python3-html5lib python3-ipykernel python3-ipython python3-ipython-genutils 548s python3-jedi python3-jupyter-client python3-jupyter-core 548s python3-jupyterlab-pygments python3-matplotlib-inline python3-mistune 548s python3-nbclient python3-nbconvert python3-nbformat python3-nest-asyncio 548s python3-notebook python3-packaging python3-pandocfilters python3-parso 548s python3-pexpect python3-platformdirs python3-prometheus-client 548s python3-prompt-toolkit python3-psutil python3-ptyprocess python3-pure-eval 548s python3-py python3-pydevd python3-send2trash python3-soupsieve 548s python3-stack-data python3-terminado python3-tinycss2 python3-tornado 548s python3-traitlets python3-typeshed python3-wcwidth python3-webencodings 548s python3-zmq 548s 0 upgraded, 87 newly installed, 0 to remove and 0 not upgraded. 548s Need to get 26.7 MB/26.7 MB of archives. 548s After this operation, 150 MB of additional disk space will be used. 548s Get:1 /tmp/autopkgtest.s4beMp/3-autopkgtest-satdep.deb autopkgtest-satdep s390x 0 [716 B] 549s Get:2 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod-common all 0.191-1 [14.6 kB] 549s Get:3 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 549s Get:4 http://ftpmaster.internal/ubuntu oracular/universe s390x fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-3 [118 kB] 549s Get:5 http://ftpmaster.internal/ubuntu oracular/main s390x fonts-mathjax all 2.7.9+dfsg-1 [2208 kB] 549s Get:6 http://ftpmaster.internal/ubuntu oracular/main s390x libbabeltrace1 s390x 1.5.11-3build3 [173 kB] 549s Get:7 http://ftpmaster.internal/ubuntu oracular/main s390x libdebuginfod1t64 s390x 0.191-1 [17.6 kB] 549s Get:8 http://ftpmaster.internal/ubuntu oracular/main s390x libpython3.12t64 s390x 3.12.4-1 [2507 kB] 549s Get:9 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight-common all 3.1.9-4.3build1 [64.2 kB] 549s Get:10 http://ftpmaster.internal/ubuntu oracular/main s390x libsource-highlight4t64 s390x 3.1.9-4.3build1 [268 kB] 549s Get:11 http://ftpmaster.internal/ubuntu oracular/main s390x gdb s390x 15.0.50.20240403-0ubuntu1 [3899 kB] 549s Get:12 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 549s Get:13 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-backbone all 1.4.1~dfsg+~1.4.15-3 [185 kB] 549s Get:14 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap all 3.4.1+dfsg-3 [129 kB] 549s Get:15 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 549s Get:16 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-bootstrap-tour all 0.12.0+dfsg-5 [21.4 kB] 549s Get:17 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-es6-promise all 4.2.8-12 [14.1 kB] 549s Get:18 http://ftpmaster.internal/ubuntu oracular/universe 
s390x node-jed all 1.1.1-4 [15.2 kB] 549s Get:19 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jed all 1.1.1-4 [2584 B] 549s Get:20 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-typeahead all 2.11.0+dfsg1-3 [48.9 kB] 550s Get:21 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-jquery-ui all 1.13.2+dfsg-1 [252 kB] 550s Get:22 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-moment all 2.29.4+ds-1 [147 kB] 550s Get:23 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-text-encoding all 0.7.0-5 [140 kB] 550s Get:24 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-xterm all 5.3.0-2 [476 kB] 550s Get:25 http://ftpmaster.internal/ubuntu oracular/universe s390x libnorm1t64 s390x 1.5.9+dfsg-3.1build1 [158 kB] 550s Get:26 http://ftpmaster.internal/ubuntu oracular/universe s390x libpgm-5.3-0t64 s390x 5.3.128~dfsg-2.1build1 [169 kB] 550s Get:27 http://ftpmaster.internal/ubuntu oracular/main s390x libsodium23 s390x 1.0.18-1build3 [138 kB] 550s Get:28 http://ftpmaster.internal/ubuntu oracular/universe s390x libzmq5 s390x 4.3.5-1build2 [258 kB] 550s Get:29 http://ftpmaster.internal/ubuntu oracular/universe s390x python-tinycss2-common all 1.3.0-1 [34.1 kB] 550s Get:30 http://ftpmaster.internal/ubuntu oracular/main s390x python3-all s390x 3.12.3-0ubuntu1 [890 B] 550s Get:31 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-argon2 s390x 21.1.0-2build1 [21.2 kB] 550s Get:32 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-asttokens all 2.4.1-1 [20.9 kB] 550s Get:33 http://ftpmaster.internal/ubuntu oracular/main s390x python3-webencodings all 0.5.1-5 [11.5 kB] 550s Get:34 http://ftpmaster.internal/ubuntu oracular/main s390x python3-html5lib all 1.1-6 [88.8 kB] 550s Get:35 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bleach all 6.1.0-2 [49.6 kB] 550s Get:36 http://ftpmaster.internal/ubuntu oracular/main s390x python3-soupsieve all 2.5-1 [33.0 kB] 550s Get:37 http://ftpmaster.internal/ubuntu oracular/main s390x python3-bs4 all 4.12.3-1 [109 kB] 550s Get:38 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-bytecode all 0.15.1-3 [44.7 kB] 550s Get:39 http://ftpmaster.internal/ubuntu oracular-proposed/universe s390x python3-traitlets all 5.14.3-1 [71.3 kB] 550s Get:40 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-comm all 0.2.1-1 [7016 B] 550s Get:41 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-coverage s390x 7.4.4+dfsg1-0ubuntu2 [147 kB] 550s Get:42 http://ftpmaster.internal/ubuntu oracular/main s390x python3-dateutil all 2.9.0-2 [80.3 kB] 550s Get:43 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pydevd s390x 2.10.0+ds-10ubuntu1 [638 kB] 550s Get:44 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-debugpy all 1.8.0+ds-4ubuntu4 [67.6 kB] 550s Get:45 http://ftpmaster.internal/ubuntu oracular/main s390x python3-decorator all 5.1.1-5 [10.1 kB] 550s Get:46 http://ftpmaster.internal/ubuntu oracular/main s390x python3-defusedxml all 0.7.1-2 [42.0 kB] 550s Get:47 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-entrypoints all 0.4-2 [7146 B] 550s Get:48 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-executing all 2.0.1-0.1 [23.3 kB] 550s Get:49 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-fastjsonschema all 2.19.1-1 [19.7 kB] 550s Get:50 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-parso all 0.8.3-1 [67.2 kB] 550s 
Get:51 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-typeshed all 0.0~git20231111.6764465-3 [1274 kB] 550s Get:52 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jedi all 0.19.1+ds1-1 [693 kB] 550s Get:53 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-matplotlib-inline all 0.1.6-2 [8784 B] 550s Get:54 http://ftpmaster.internal/ubuntu oracular/main s390x python3-ptyprocess all 0.7.0-5 [15.1 kB] 550s Get:55 http://ftpmaster.internal/ubuntu oracular/main s390x python3-pexpect all 4.9-2 [48.1 kB] 550s Get:56 http://ftpmaster.internal/ubuntu oracular/main s390x python3-wcwidth all 0.2.5+dfsg1-1.1ubuntu1 [22.5 kB] 550s Get:57 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-prompt-toolkit all 3.0.46-1 [256 kB] 550s Get:58 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pure-eval all 0.2.2-2 [11.1 kB] 550s Get:59 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-stack-data all 0.6.3-1 [22.0 kB] 550s Get:60 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython all 8.20.0-1ubuntu1 [561 kB] 550s Get:61 http://ftpmaster.internal/ubuntu oracular/main s390x python3-platformdirs all 4.2.1-1 [16.3 kB] 550s Get:62 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-core all 5.3.2-2 [25.5 kB] 550s Get:63 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nest-asyncio all 1.5.4-1 [6256 B] 550s Get:64 http://ftpmaster.internal/ubuntu oracular/main s390x python3-tornado s390x 6.4.1-1 [298 kB] 550s Get:65 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-py all 1.11.0-2 [72.7 kB] 550s Get:66 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-zmq s390x 24.0.1-5build1 [298 kB] 550s Get:67 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyter-client all 7.4.9-2ubuntu1 [90.5 kB] 550s Get:68 http://ftpmaster.internal/ubuntu oracular/main s390x python3-packaging all 24.0-1 [41.1 kB] 550s Get:69 http://ftpmaster.internal/ubuntu oracular/main s390x python3-psutil s390x 5.9.8-2build2 [195 kB] 550s Get:70 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipykernel all 6.29.3-1ubuntu1 [82.6 kB] 550s Get:71 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-ipython-genutils all 0.2.0-6 [22.0 kB] 550s Get:72 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-jupyterlab-pygments all 0.2.2-3 [6054 B] 550s Get:73 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-mistune all 3.0.2-1 [32.8 kB] 550s Get:74 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbformat all 5.9.1-1 [41.2 kB] 550s Get:75 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbclient all 0.8.0-1 [55.6 kB] 550s Get:76 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-pandocfilters all 1.5.1-1 [23.6 kB] 550s Get:77 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-tinycss2 all 1.3.0-1 [19.6 kB] 550s Get:78 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-nbconvert all 7.16.4-1 [156 kB] 550s Get:79 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-codemirror all 5.65.0+~cs5.83.9-3 [755 kB] 550s Get:80 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-marked all 4.2.3+ds+~4.0.7-3 [36.2 kB] 550s Get:81 http://ftpmaster.internal/ubuntu oracular/main s390x libjs-mathjax all 2.7.9+dfsg-1 [5665 kB] 550s Get:82 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs all 2.3.6+ds+~2.1.37-1 [201 
kB] 550s Get:83 http://ftpmaster.internal/ubuntu oracular/universe s390x libjs-requirejs-text all 2.0.12-1.1 [9056 B] 550s Get:84 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-terminado all 0.18.1-1 [13.2 kB] 550s Get:85 http://ftpmaster.internal/ubuntu oracular/main s390x python3-prometheus-client all 0.19.0+ds1-1 [41.7 kB] 550s Get:86 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-send2trash all 1.8.2-1 [15.5 kB] 550s Get:87 http://ftpmaster.internal/ubuntu oracular/universe s390x python3-notebook all 6.4.12-2.2ubuntu1 [1566 kB] 551s Preconfiguring packages ... 551s Fetched 26.7 MB in 2s (13.9 MB/s) 551s Selecting previously unselected package libdebuginfod-common. 551s (Reading database ... 54671 files and directories currently installed.) 551s Preparing to unpack .../00-libdebuginfod-common_0.191-1_all.deb ... 551s Unpacking libdebuginfod-common (0.191-1) ... 551s Selecting previously unselected package fonts-font-awesome. 551s Preparing to unpack .../01-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 551s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 551s Selecting previously unselected package fonts-glyphicons-halflings. 551s Preparing to unpack .../02-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-3_all.deb ... 551s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 551s Selecting previously unselected package fonts-mathjax. 551s Preparing to unpack .../03-fonts-mathjax_2.7.9+dfsg-1_all.deb ... 551s Unpacking fonts-mathjax (2.7.9+dfsg-1) ... 551s Selecting previously unselected package libbabeltrace1:s390x. 551s Preparing to unpack .../04-libbabeltrace1_1.5.11-3build3_s390x.deb ... 551s Unpacking libbabeltrace1:s390x (1.5.11-3build3) ... 551s Selecting previously unselected package libdebuginfod1t64:s390x. 551s Preparing to unpack .../05-libdebuginfod1t64_0.191-1_s390x.deb ... 551s Unpacking libdebuginfod1t64:s390x (0.191-1) ... 551s Selecting previously unselected package libpython3.12t64:s390x. 551s Preparing to unpack .../06-libpython3.12t64_3.12.4-1_s390x.deb ... 551s Unpacking libpython3.12t64:s390x (3.12.4-1) ... 551s Selecting previously unselected package libsource-highlight-common. 551s Preparing to unpack .../07-libsource-highlight-common_3.1.9-4.3build1_all.deb ... 551s Unpacking libsource-highlight-common (3.1.9-4.3build1) ... 551s Selecting previously unselected package libsource-highlight4t64:s390x. 551s Preparing to unpack .../08-libsource-highlight4t64_3.1.9-4.3build1_s390x.deb ... 551s Unpacking libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 551s Selecting previously unselected package gdb. 551s Preparing to unpack .../09-gdb_15.0.50.20240403-0ubuntu1_s390x.deb ... 551s Unpacking gdb (15.0.50.20240403-0ubuntu1) ... 552s Selecting previously unselected package libjs-underscore. 552s Preparing to unpack .../10-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 552s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
552s Selecting previously unselected package libjs-backbone. 552s Preparing to unpack .../11-libjs-backbone_1.4.1~dfsg+~1.4.15-3_all.deb ... 552s Unpacking libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 552s Selecting previously unselected package libjs-bootstrap. 552s Preparing to unpack .../12-libjs-bootstrap_3.4.1+dfsg-3_all.deb ... 552s Unpacking libjs-bootstrap (3.4.1+dfsg-3) ... 552s Selecting previously unselected package libjs-jquery. 552s Preparing to unpack .../13-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 552s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 552s Selecting previously unselected package libjs-bootstrap-tour. 552s Preparing to unpack .../14-libjs-bootstrap-tour_0.12.0+dfsg-5_all.deb ... 552s Unpacking libjs-bootstrap-tour (0.12.0+dfsg-5) ... 552s Selecting previously unselected package libjs-es6-promise. 552s Preparing to unpack .../15-libjs-es6-promise_4.2.8-12_all.deb ... 552s Unpacking libjs-es6-promise (4.2.8-12) ... 552s Selecting previously unselected package node-jed. 552s Preparing to unpack .../16-node-jed_1.1.1-4_all.deb ... 552s Unpacking node-jed (1.1.1-4) ... 552s Selecting previously unselected package libjs-jed. 552s Preparing to unpack .../17-libjs-jed_1.1.1-4_all.deb ... 552s Unpacking libjs-jed (1.1.1-4) ... 552s Selecting previously unselected package libjs-jquery-typeahead. 552s Preparing to unpack .../18-libjs-jquery-typeahead_2.11.0+dfsg1-3_all.deb ... 552s Unpacking libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 552s Selecting previously unselected package libjs-jquery-ui. 552s Preparing to unpack .../19-libjs-jquery-ui_1.13.2+dfsg-1_all.deb ... 552s Unpacking libjs-jquery-ui (1.13.2+dfsg-1) ... 552s Selecting previously unselected package libjs-moment. 552s Preparing to unpack .../20-libjs-moment_2.29.4+ds-1_all.deb ... 552s Unpacking libjs-moment (2.29.4+ds-1) ... 552s Selecting previously unselected package libjs-text-encoding. 552s Preparing to unpack .../21-libjs-text-encoding_0.7.0-5_all.deb ... 552s Unpacking libjs-text-encoding (0.7.0-5) ... 552s Selecting previously unselected package libjs-xterm. 552s Preparing to unpack .../22-libjs-xterm_5.3.0-2_all.deb ... 552s Unpacking libjs-xterm (5.3.0-2) ... 552s Selecting previously unselected package libnorm1t64:s390x. 552s Preparing to unpack .../23-libnorm1t64_1.5.9+dfsg-3.1build1_s390x.deb ... 552s Unpacking libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 552s Selecting previously unselected package libpgm-5.3-0t64:s390x. 552s Preparing to unpack .../24-libpgm-5.3-0t64_5.3.128~dfsg-2.1build1_s390x.deb ... 552s Unpacking libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 552s Selecting previously unselected package libsodium23:s390x. 552s Preparing to unpack .../25-libsodium23_1.0.18-1build3_s390x.deb ... 552s Unpacking libsodium23:s390x (1.0.18-1build3) ... 552s Selecting previously unselected package libzmq5:s390x. 552s Preparing to unpack .../26-libzmq5_4.3.5-1build2_s390x.deb ... 552s Unpacking libzmq5:s390x (4.3.5-1build2) ... 552s Selecting previously unselected package python-tinycss2-common. 552s Preparing to unpack .../27-python-tinycss2-common_1.3.0-1_all.deb ... 552s Unpacking python-tinycss2-common (1.3.0-1) ... 552s Selecting previously unselected package python3-all. 552s Preparing to unpack .../28-python3-all_3.12.3-0ubuntu1_s390x.deb ... 552s Unpacking python3-all (3.12.3-0ubuntu1) ... 552s Selecting previously unselected package python3-argon2. 552s Preparing to unpack .../29-python3-argon2_21.1.0-2build1_s390x.deb ... 552s Unpacking python3-argon2 (21.1.0-2build1) ... 
552s Selecting previously unselected package python3-asttokens. 552s Preparing to unpack .../30-python3-asttokens_2.4.1-1_all.deb ... 552s Unpacking python3-asttokens (2.4.1-1) ... 552s Selecting previously unselected package python3-webencodings. 552s Preparing to unpack .../31-python3-webencodings_0.5.1-5_all.deb ... 552s Unpacking python3-webencodings (0.5.1-5) ... 552s Selecting previously unselected package python3-html5lib. 552s Preparing to unpack .../32-python3-html5lib_1.1-6_all.deb ... 552s Unpacking python3-html5lib (1.1-6) ... 552s Selecting previously unselected package python3-bleach. 552s Preparing to unpack .../33-python3-bleach_6.1.0-2_all.deb ... 552s Unpacking python3-bleach (6.1.0-2) ... 552s Selecting previously unselected package python3-soupsieve. 552s Preparing to unpack .../34-python3-soupsieve_2.5-1_all.deb ... 552s Unpacking python3-soupsieve (2.5-1) ... 552s Selecting previously unselected package python3-bs4. 552s Preparing to unpack .../35-python3-bs4_4.12.3-1_all.deb ... 552s Unpacking python3-bs4 (4.12.3-1) ... 552s Selecting previously unselected package python3-bytecode. 552s Preparing to unpack .../36-python3-bytecode_0.15.1-3_all.deb ... 552s Unpacking python3-bytecode (0.15.1-3) ... 552s Selecting previously unselected package python3-traitlets. 552s Preparing to unpack .../37-python3-traitlets_5.14.3-1_all.deb ... 552s Unpacking python3-traitlets (5.14.3-1) ... 552s Selecting previously unselected package python3-comm. 552s Preparing to unpack .../38-python3-comm_0.2.1-1_all.deb ... 552s Unpacking python3-comm (0.2.1-1) ... 552s Selecting previously unselected package python3-coverage. 552s Preparing to unpack .../39-python3-coverage_7.4.4+dfsg1-0ubuntu2_s390x.deb ... 552s Unpacking python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 552s Selecting previously unselected package python3-dateutil. 552s Preparing to unpack .../40-python3-dateutil_2.9.0-2_all.deb ... 552s Unpacking python3-dateutil (2.9.0-2) ... 552s Selecting previously unselected package python3-pydevd. 552s Preparing to unpack .../41-python3-pydevd_2.10.0+ds-10ubuntu1_s390x.deb ... 552s Unpacking python3-pydevd (2.10.0+ds-10ubuntu1) ... 552s Selecting previously unselected package python3-debugpy. 552s Preparing to unpack .../42-python3-debugpy_1.8.0+ds-4ubuntu4_all.deb ... 552s Unpacking python3-debugpy (1.8.0+ds-4ubuntu4) ... 552s Selecting previously unselected package python3-decorator. 552s Preparing to unpack .../43-python3-decorator_5.1.1-5_all.deb ... 552s Unpacking python3-decorator (5.1.1-5) ... 552s Selecting previously unselected package python3-defusedxml. 552s Preparing to unpack .../44-python3-defusedxml_0.7.1-2_all.deb ... 552s Unpacking python3-defusedxml (0.7.1-2) ... 552s Selecting previously unselected package python3-entrypoints. 552s Preparing to unpack .../45-python3-entrypoints_0.4-2_all.deb ... 552s Unpacking python3-entrypoints (0.4-2) ... 552s Selecting previously unselected package python3-executing. 552s Preparing to unpack .../46-python3-executing_2.0.1-0.1_all.deb ... 552s Unpacking python3-executing (2.0.1-0.1) ... 552s Selecting previously unselected package python3-fastjsonschema. 552s Preparing to unpack .../47-python3-fastjsonschema_2.19.1-1_all.deb ... 552s Unpacking python3-fastjsonschema (2.19.1-1) ... 552s Selecting previously unselected package python3-parso. 552s Preparing to unpack .../48-python3-parso_0.8.3-1_all.deb ... 552s Unpacking python3-parso (0.8.3-1) ... 552s Selecting previously unselected package python3-typeshed. 
552s Preparing to unpack .../49-python3-typeshed_0.0~git20231111.6764465-3_all.deb ... 552s Unpacking python3-typeshed (0.0~git20231111.6764465-3) ... 553s Selecting previously unselected package python3-jedi. 553s Preparing to unpack .../50-python3-jedi_0.19.1+ds1-1_all.deb ... 553s Unpacking python3-jedi (0.19.1+ds1-1) ... 553s Selecting previously unselected package python3-matplotlib-inline. 553s Preparing to unpack .../51-python3-matplotlib-inline_0.1.6-2_all.deb ... 553s Unpacking python3-matplotlib-inline (0.1.6-2) ... 553s Selecting previously unselected package python3-ptyprocess. 553s Preparing to unpack .../52-python3-ptyprocess_0.7.0-5_all.deb ... 553s Unpacking python3-ptyprocess (0.7.0-5) ... 553s Selecting previously unselected package python3-pexpect. 553s Preparing to unpack .../53-python3-pexpect_4.9-2_all.deb ... 553s Unpacking python3-pexpect (4.9-2) ... 553s Selecting previously unselected package python3-wcwidth. 553s Preparing to unpack .../54-python3-wcwidth_0.2.5+dfsg1-1.1ubuntu1_all.deb ... 553s Unpacking python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 553s Selecting previously unselected package python3-prompt-toolkit. 553s Preparing to unpack .../55-python3-prompt-toolkit_3.0.46-1_all.deb ... 553s Unpacking python3-prompt-toolkit (3.0.46-1) ... 553s Selecting previously unselected package python3-pure-eval. 553s Preparing to unpack .../56-python3-pure-eval_0.2.2-2_all.deb ... 553s Unpacking python3-pure-eval (0.2.2-2) ... 553s Selecting previously unselected package python3-stack-data. 553s Preparing to unpack .../57-python3-stack-data_0.6.3-1_all.deb ... 553s Unpacking python3-stack-data (0.6.3-1) ... 553s Selecting previously unselected package python3-ipython. 553s Preparing to unpack .../58-python3-ipython_8.20.0-1ubuntu1_all.deb ... 553s Unpacking python3-ipython (8.20.0-1ubuntu1) ... 553s Selecting previously unselected package python3-platformdirs. 553s Preparing to unpack .../59-python3-platformdirs_4.2.1-1_all.deb ... 553s Unpacking python3-platformdirs (4.2.1-1) ... 553s Selecting previously unselected package python3-jupyter-core. 553s Preparing to unpack .../60-python3-jupyter-core_5.3.2-2_all.deb ... 553s Unpacking python3-jupyter-core (5.3.2-2) ... 553s Selecting previously unselected package python3-nest-asyncio. 553s Preparing to unpack .../61-python3-nest-asyncio_1.5.4-1_all.deb ... 553s Unpacking python3-nest-asyncio (1.5.4-1) ... 553s Selecting previously unselected package python3-tornado. 553s Preparing to unpack .../62-python3-tornado_6.4.1-1_s390x.deb ... 553s Unpacking python3-tornado (6.4.1-1) ... 553s Selecting previously unselected package python3-py. 553s Preparing to unpack .../63-python3-py_1.11.0-2_all.deb ... 553s Unpacking python3-py (1.11.0-2) ... 553s Selecting previously unselected package python3-zmq. 553s Preparing to unpack .../64-python3-zmq_24.0.1-5build1_s390x.deb ... 553s Unpacking python3-zmq (24.0.1-5build1) ... 553s Selecting previously unselected package python3-jupyter-client. 553s Preparing to unpack .../65-python3-jupyter-client_7.4.9-2ubuntu1_all.deb ... 553s Unpacking python3-jupyter-client (7.4.9-2ubuntu1) ... 554s Selecting previously unselected package python3-packaging. 554s Preparing to unpack .../66-python3-packaging_24.0-1_all.deb ... 554s Unpacking python3-packaging (24.0-1) ... 554s Selecting previously unselected package python3-psutil. 554s Preparing to unpack .../67-python3-psutil_5.9.8-2build2_s390x.deb ... 554s Unpacking python3-psutil (5.9.8-2build2) ... 
554s Selecting previously unselected package python3-ipykernel. 554s Preparing to unpack .../68-python3-ipykernel_6.29.3-1ubuntu1_all.deb ... 554s Unpacking python3-ipykernel (6.29.3-1ubuntu1) ... 554s Selecting previously unselected package python3-ipython-genutils. 554s Preparing to unpack .../69-python3-ipython-genutils_0.2.0-6_all.deb ... 554s Unpacking python3-ipython-genutils (0.2.0-6) ... 554s Selecting previously unselected package python3-jupyterlab-pygments. 554s Preparing to unpack .../70-python3-jupyterlab-pygments_0.2.2-3_all.deb ... 554s Unpacking python3-jupyterlab-pygments (0.2.2-3) ... 554s Selecting previously unselected package python3-mistune. 554s Preparing to unpack .../71-python3-mistune_3.0.2-1_all.deb ... 554s Unpacking python3-mistune (3.0.2-1) ... 554s Selecting previously unselected package python3-nbformat. 554s Preparing to unpack .../72-python3-nbformat_5.9.1-1_all.deb ... 554s Unpacking python3-nbformat (5.9.1-1) ... 554s Selecting previously unselected package python3-nbclient. 554s Preparing to unpack .../73-python3-nbclient_0.8.0-1_all.deb ... 554s Unpacking python3-nbclient (0.8.0-1) ... 554s Selecting previously unselected package python3-pandocfilters. 554s Preparing to unpack .../74-python3-pandocfilters_1.5.1-1_all.deb ... 554s Unpacking python3-pandocfilters (1.5.1-1) ... 554s Selecting previously unselected package python3-tinycss2. 554s Preparing to unpack .../75-python3-tinycss2_1.3.0-1_all.deb ... 554s Unpacking python3-tinycss2 (1.3.0-1) ... 554s Selecting previously unselected package python3-nbconvert. 554s Preparing to unpack .../76-python3-nbconvert_7.16.4-1_all.deb ... 554s Unpacking python3-nbconvert (7.16.4-1) ... 554s Selecting previously unselected package libjs-codemirror. 554s Preparing to unpack .../77-libjs-codemirror_5.65.0+~cs5.83.9-3_all.deb ... 554s Unpacking libjs-codemirror (5.65.0+~cs5.83.9-3) ... 554s Selecting previously unselected package libjs-marked. 554s Preparing to unpack .../78-libjs-marked_4.2.3+ds+~4.0.7-3_all.deb ... 554s Unpacking libjs-marked (4.2.3+ds+~4.0.7-3) ... 554s Selecting previously unselected package libjs-mathjax. 554s Preparing to unpack .../79-libjs-mathjax_2.7.9+dfsg-1_all.deb ... 554s Unpacking libjs-mathjax (2.7.9+dfsg-1) ... 555s Selecting previously unselected package libjs-requirejs. 555s Preparing to unpack .../80-libjs-requirejs_2.3.6+ds+~2.1.37-1_all.deb ... 555s Unpacking libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 555s Selecting previously unselected package libjs-requirejs-text. 555s Preparing to unpack .../81-libjs-requirejs-text_2.0.12-1.1_all.deb ... 555s Unpacking libjs-requirejs-text (2.0.12-1.1) ... 555s Selecting previously unselected package python3-terminado. 555s Preparing to unpack .../82-python3-terminado_0.18.1-1_all.deb ... 555s Unpacking python3-terminado (0.18.1-1) ... 555s Selecting previously unselected package python3-prometheus-client. 555s Preparing to unpack .../83-python3-prometheus-client_0.19.0+ds1-1_all.deb ... 555s Unpacking python3-prometheus-client (0.19.0+ds1-1) ... 555s Selecting previously unselected package python3-send2trash. 555s Preparing to unpack .../84-python3-send2trash_1.8.2-1_all.deb ... 555s Unpacking python3-send2trash (1.8.2-1) ... 555s Selecting previously unselected package python3-notebook. 555s Preparing to unpack .../85-python3-notebook_6.4.12-2.2ubuntu1_all.deb ... 555s Unpacking python3-notebook (6.4.12-2.2ubuntu1) ... 555s Selecting previously unselected package autopkgtest-satdep. 
555s Preparing to unpack .../86-3-autopkgtest-satdep.deb ... 555s Unpacking autopkgtest-satdep (0) ... 555s Setting up python3-entrypoints (0.4-2) ... 555s Setting up libjs-jquery-typeahead (2.11.0+dfsg1-3) ... 555s Setting up python3-tornado (6.4.1-1) ... 556s Setting up libnorm1t64:s390x (1.5.9+dfsg-3.1build1) ... 556s Setting up python3-pure-eval (0.2.2-2) ... 556s Setting up python3-send2trash (1.8.2-1) ... 556s Setting up fonts-mathjax (2.7.9+dfsg-1) ... 556s Setting up libsodium23:s390x (1.0.18-1build3) ... 556s Setting up libjs-mathjax (2.7.9+dfsg-1) ... 556s Setting up python3-py (1.11.0-2) ... 556s Setting up libdebuginfod-common (0.191-1) ... 556s Setting up libjs-requirejs-text (2.0.12-1.1) ... 556s Setting up python3-parso (0.8.3-1) ... 557s Setting up python3-defusedxml (0.7.1-2) ... 557s Setting up python3-ipython-genutils (0.2.0-6) ... 557s Setting up python3-asttokens (2.4.1-1) ... 557s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-3) ... 557s Setting up python3-all (3.12.3-0ubuntu1) ... 557s Setting up python3-coverage (7.4.4+dfsg1-0ubuntu2) ... 557s Setting up libjs-moment (2.29.4+ds-1) ... 557s Setting up python3-pandocfilters (1.5.1-1) ... 557s Setting up libjs-requirejs (2.3.6+ds+~2.1.37-1) ... 557s Setting up libjs-es6-promise (4.2.8-12) ... 557s Setting up libjs-text-encoding (0.7.0-5) ... 557s Setting up python3-webencodings (0.5.1-5) ... 558s Setting up python3-platformdirs (4.2.1-1) ... 558s Setting up python3-psutil (5.9.8-2build2) ... 558s Setting up libsource-highlight-common (3.1.9-4.3build1) ... 558s Setting up python3-jupyterlab-pygments (0.2.2-3) ... 558s Setting up libpython3.12t64:s390x (3.12.4-1) ... 558s Setting up libpgm-5.3-0t64:s390x (5.3.128~dfsg-2.1build1) ... 558s Setting up python3-decorator (5.1.1-5) ... 558s Setting up python3-packaging (24.0-1) ... 558s Setting up python3-wcwidth (0.2.5+dfsg1-1.1ubuntu1) ... 559s Setting up node-jed (1.1.1-4) ... 559s Setting up python3-typeshed (0.0~git20231111.6764465-3) ... 559s Setting up python3-executing (2.0.1-0.1) ... 559s Setting up libjs-xterm (5.3.0-2) ... 559s Setting up python3-nest-asyncio (1.5.4-1) ... 559s Setting up python3-bytecode (0.15.1-3) ... 559s Setting up libjs-codemirror (5.65.0+~cs5.83.9-3) ... 559s Setting up libjs-jed (1.1.1-4) ... 559s Setting up python3-html5lib (1.1-6) ... 559s Setting up libbabeltrace1:s390x (1.5.11-3build3) ... 559s Setting up python3-fastjsonschema (2.19.1-1) ... 559s Setting up python3-traitlets (5.14.3-1) ... 560s Setting up python-tinycss2-common (1.3.0-1) ... 560s Setting up python3-argon2 (21.1.0-2build1) ... 560s Setting up python3-dateutil (2.9.0-2) ... 560s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 560s Setting up python3-mistune (3.0.2-1) ... 560s Setting up python3-stack-data (0.6.3-1) ... 560s Setting up python3-soupsieve (2.5-1) ... 560s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 560s Setting up python3-jupyter-core (5.3.2-2) ... 561s Setting up libjs-bootstrap (3.4.1+dfsg-3) ... 561s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 561s Setting up python3-ptyprocess (0.7.0-5) ... 561s Setting up libjs-marked (4.2.3+ds+~4.0.7-3) ... 561s Setting up python3-prompt-toolkit (3.0.46-1) ... 561s Setting up libdebuginfod1t64:s390x (0.191-1) ... 561s Setting up python3-tinycss2 (1.3.0-1) ... 561s Setting up libzmq5:s390x (4.3.5-1build2) ... 561s Setting up python3-jedi (0.19.1+ds1-1) ... 562s Setting up libjs-bootstrap-tour (0.12.0+dfsg-5) ... 562s Setting up libjs-backbone (1.4.1~dfsg+~1.4.15-3) ... 
562s Setting up libsource-highlight4t64:s390x (3.1.9-4.3build1) ... 562s Setting up python3-nbformat (5.9.1-1) ... 562s Setting up python3-bs4 (4.12.3-1) ... 562s Setting up python3-bleach (6.1.0-2) ... 562s Setting up python3-matplotlib-inline (0.1.6-2) ... 562s Setting up python3-comm (0.2.1-1) ... 562s Setting up python3-prometheus-client (0.19.0+ds1-1) ... 562s Setting up gdb (15.0.50.20240403-0ubuntu1) ... 562s Setting up libjs-jquery-ui (1.13.2+dfsg-1) ... 562s Setting up python3-pexpect (4.9-2) ... 563s Setting up python3-zmq (24.0.1-5build1) ... 563s Setting up python3-terminado (0.18.1-1) ... 563s Setting up python3-jupyter-client (7.4.9-2ubuntu1) ... 563s Setting up python3-pydevd (2.10.0+ds-10ubuntu1) ... 564s Setting up python3-debugpy (1.8.0+ds-4ubuntu4) ... 564s Setting up python3-nbclient (0.8.0-1) ... 564s Setting up python3-ipython (8.20.0-1ubuntu1) ... 565s Setting up python3-ipykernel (6.29.3-1ubuntu1) ... 565s Setting up python3-nbconvert (7.16.4-1) ... 565s Setting up python3-notebook (6.4.12-2.2ubuntu1) ... 566s Setting up autopkgtest-satdep (0) ... 566s Processing triggers for man-db (2.12.1-2) ... 566s Processing triggers for libc-bin (2.39-0ubuntu9) ... 569s (Reading database ... 70872 files and directories currently installed.) 569s Removing autopkgtest-satdep (0) ... 571s autopkgtest [10:33:39]: test autodep8-python3: set -e ; for py in $(py3versions -r 2>/dev/null) ; do cd "$AUTOPKGTEST_TMP" ; echo "Testing with $py:" ; $py -c "import notebook; print(notebook)" ; done 571s autopkgtest [10:33:39]: test autodep8-python3: [----------------------- 571s Testing with python3.12: 572s 572s autopkgtest [10:33:40]: test autodep8-python3: -----------------------] 572s autodep8-python3 PASS (superficial) 572s autopkgtest [10:33:40]: test autodep8-python3: - - - - - - - - - - results - - - - - - - - - - 573s autopkgtest [10:33:41]: @@@@@@@@@@@@@@@@@@@@ summary 573s pytest FAIL non-zero exit status 1 573s command1 PASS (superficial) 573s autodep8-python3 PASS (superficial) 608s nova [W] Using flock in scalingstack-bos01-s390x 608s Creating nova instance adt-oracular-s390x-jupyter-notebook-20240616-102408-juju-7f2275-prod-proposed-migration-environment-2-20636b6c-7120-4349-8d7d-28bda5c08416 from image adt/ubuntu-oracular-s390x-server-20240616.img (UUID d6f70b60-e0c1-480d-8231-cedbbc2f917e)...