0s autopkgtest [08:42:27]: starting date and time: 2025-06-30 08:42:27+0000
0s autopkgtest [08:42:27]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [08:42:27]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.4sx1s70b/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:fsspec --apt-upgrade fsspec --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=fsspec/2025.3.2-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-cpu2-ram4-disk20-ppc64el --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@sto01-ppc64el-14.secgroup --name adt-questing-ppc64el-fsspec-20250630-082131-juju-7f2275-prod-proposed-migration-environment-20-7e28c4e1-470e-4be6-afa4-3ce970732b74 --image adt/ubuntu-questing-ppc64el-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-autopkgtest-workers-ppc64el -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
2s Creating nova instance adt-questing-ppc64el-fsspec-20250630-082131-juju-7f2275-prod-proposed-migration-environment-20-7e28c4e1-470e-4be6-afa4-3ce970732b74 from image adt/ubuntu-questing-ppc64el-server-20250630.img (UUID 62ece32a-a77f-4f90-83a4-cbbec604149c)...
45s autopkgtest [08:43:12]: testbed dpkg architecture: ppc64el
45s autopkgtest [08:43:12]: testbed apt version: 3.1.2
46s autopkgtest [08:43:13]: @@@@@@@@@@@@@@@@@@@@ test bed setup
46s autopkgtest [08:43:13]: testbed release detected to be: None
47s autopkgtest [08:43:14]: updating testbed package index (apt update)
47s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
47s Hit:2 http://ftpmaster.internal/ubuntu questing InRelease
47s Hit:3 http://ftpmaster.internal/ubuntu questing-updates InRelease
47s Hit:4 http://ftpmaster.internal/ubuntu questing-security InRelease
47s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [429 kB]
47s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [17.5 kB]
47s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [26.6 kB]
47s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main ppc64el Packages [33.1 kB]
47s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/universe ppc64el Packages [375 kB]
47s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/multiverse ppc64el Packages [5260 B]
47s Fetched 1136 kB in 0s (2308 kB/s)
48s Reading package lists...
49s autopkgtest [08:43:16]: upgrading testbed (apt dist-upgrade and autopurge)
49s Reading package lists...
49s Building dependency tree...
49s Reading state information...
49s Calculating upgrade...
49s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
49s Reading package lists...
49s Building dependency tree...
49s Reading state information...
49s Solving dependencies...
49s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
52s autopkgtest [08:43:19]: testbed running kernel: Linux 6.15.0-3-generic #3-Ubuntu SMP Wed Jun 4 08:35:52 UTC 2025
52s autopkgtest [08:43:19]: @@@@@@@@@@@@@@@@@@@@ apt-source fsspec
54s Get:1 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (dsc) [2580 B]
54s Get:2 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (tar) [432 kB]
54s Get:3 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (diff) [7208 B]
54s gpgv: Signature made Fri Apr 4 17:43:51 2025 UTC
54s gpgv: using RSA key 13796755BBC72BB8ABE2AEB5FA9DEC5DE11C63F1
54s gpgv: issuer "eamanu@debian.org"
54s gpgv: Can't check signature: No public key
54s dpkg-source: warning: cannot verify inline signature for ./fsspec_2025.3.2-1.dsc: no acceptable signature found
54s autopkgtest [08:43:21]: testing package fsspec version 2025.3.2-1
54s autopkgtest [08:43:21]: build not needed
55s autopkgtest [08:43:22]: test fsspec-tests: preparing testbed
55s Reading package lists...
55s Building dependency tree...
55s Reading state information...
55s Solving dependencies...
55s The following NEW packages will be installed:
55s   fonts-font-awesome fonts-lato libblas3 libgfortran5 libjs-jquery
55s   libjs-sphinxdoc libjs-underscore liblapack3 python-fsspec-doc
55s   python3-aiohappyeyeballs python3-aiohttp python3-aiosignal python3-all
55s   python3-async-generator python3-async-timeout python3-frozenlist
55s   python3-fsspec python3-iniconfig python3-multidict python3-numpy
55s   python3-numpy-dev python3-pluggy python3-propcache python3-pytest
55s   python3-pytest-asyncio python3-pytest-mock python3-pytest-vcr python3-tqdm
55s   python3-vcr python3-wrapt python3-yarl sphinx-rtd-theme-common
55s 0 upgraded, 32 newly installed, 0 to remove and 0 not upgraded.
55s Need to get 15.2 MB of archives.
55s After this operation, 71.4 MB of additional disk space will be used.
55s Get:1 http://ftpmaster.internal/ubuntu questing/main ppc64el fonts-lato all 2.015-1 [2781 kB]
56s Get:2 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-numpy-dev ppc64el 1:2.2.4+ds-1ubuntu1 [153 kB]
56s Get:3 http://ftpmaster.internal/ubuntu questing/main ppc64el libblas3 ppc64el 3.12.1-2build1 [239 kB]
56s Get:4 http://ftpmaster.internal/ubuntu questing/main ppc64el libgfortran5 ppc64el 15.1.0-8ubuntu1 [620 kB]
56s Get:5 http://ftpmaster.internal/ubuntu questing/main ppc64el liblapack3 ppc64el 3.12.1-2build1 [2817 kB]
56s Get:6 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-numpy ppc64el 1:2.2.4+ds-1ubuntu1 [4887 kB]
56s Get:7 http://ftpmaster.internal/ubuntu questing/main ppc64el fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB]
56s Get:8 http://ftpmaster.internal/ubuntu questing/main ppc64el libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB]
56s Get:9 http://ftpmaster.internal/ubuntu questing/main ppc64el libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB]
56s Get:10 http://ftpmaster.internal/ubuntu questing/main ppc64el libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB]
56s Get:11 http://ftpmaster.internal/ubuntu questing/main ppc64el sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB]
56s Get:12 http://ftpmaster.internal/ubuntu questing-proposed/universe ppc64el python-fsspec-doc all 2025.3.2-1 [321 kB]
56s Get:13 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-aiohappyeyeballs all 2.6.1-1 [11.1 kB]
56s Get:14 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-multidict ppc64el 6.4.3-1 [52.8 kB]
56s Get:15 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-propcache ppc64el 0.3.1-1 [43.7 kB]
56s Get:16 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-yarl ppc64el 1.19.0-1 [93.2 kB]
56s Get:17 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-async-timeout all 5.0.1-1 [6830 B]
56s Get:18 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-frozenlist ppc64el 1.6.0-1 [104 kB]
56s Get:19 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-aiosignal all 1.3.2-1 [5182 B]
56s Get:20 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-aiohttp ppc64el 3.11.16-1 [373 kB]
56s Get:21 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-all ppc64el 3.13.4-1 [880 B]
56s Get:22 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-async-generator all 1.10-4 [17.5 kB]
56s Get:23 http://ftpmaster.internal/ubuntu questing-proposed/universe ppc64el python3-fsspec all 2025.3.2-1 [217 kB]
56s Get:24 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-iniconfig all 1.1.1-2 [6024 B]
56s Get:25 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pluggy all 1.5.0-1 [21.0 kB]
56s Get:26 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pytest all 8.3.5-2 [252 kB]
56s Get:27 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pytest-asyncio all 0.25.1-1 [17.0 kB]
56s Get:28 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pytest-mock all 3.14.0-2 [11.7 kB]
56s Get:29 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-wrapt ppc64el 1.15.0-4build1 [35.7 kB]
56s Get:30 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-vcr all 7.0.0-2 [33.3 kB]
56s Get:31 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pytest-vcr all 1.0.2-4 [5228 B]
56s Get:32 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-tqdm all 4.67.1-5 [92.1 kB]
56s Fetched 15.2 MB in 1s (18.9 MB/s)
56s Selecting previously unselected package fonts-lato.
57s (Reading database ... 117841 files and directories currently installed.)
57s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ...
57s Unpacking fonts-lato (2.015-1) ...
57s Selecting previously unselected package python3-numpy-dev:ppc64el.
57s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_ppc64el.deb ...
57s Unpacking python3-numpy-dev:ppc64el (1:2.2.4+ds-1ubuntu1) ...
57s Selecting previously unselected package libblas3:ppc64el.
57s Preparing to unpack .../02-libblas3_3.12.1-2build1_ppc64el.deb ...
57s Unpacking libblas3:ppc64el (3.12.1-2build1) ...
57s Selecting previously unselected package libgfortran5:ppc64el.
57s Preparing to unpack .../03-libgfortran5_15.1.0-8ubuntu1_ppc64el.deb ...
57s Unpacking libgfortran5:ppc64el (15.1.0-8ubuntu1) ...
57s Selecting previously unselected package liblapack3:ppc64el.
57s Preparing to unpack .../04-liblapack3_3.12.1-2build1_ppc64el.deb ...
57s Unpacking liblapack3:ppc64el (3.12.1-2build1) ...
57s Selecting previously unselected package python3-numpy.
57s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_ppc64el.deb ...
57s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ...
57s Selecting previously unselected package fonts-font-awesome.
57s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ...
57s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
57s Selecting previously unselected package libjs-jquery.
57s Preparing to unpack .../07-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ...
57s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
57s Selecting previously unselected package libjs-underscore.
57s Preparing to unpack .../08-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ...
57s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
57s Selecting previously unselected package libjs-sphinxdoc.
57s Preparing to unpack .../09-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ...
57s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ...
57s Selecting previously unselected package sphinx-rtd-theme-common.
57s Preparing to unpack .../10-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ...
57s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
57s Selecting previously unselected package python-fsspec-doc.
57s Preparing to unpack .../11-python-fsspec-doc_2025.3.2-1_all.deb ...
57s Unpacking python-fsspec-doc (2025.3.2-1) ...
57s Selecting previously unselected package python3-aiohappyeyeballs.
57s Preparing to unpack .../12-python3-aiohappyeyeballs_2.6.1-1_all.deb ...
57s Unpacking python3-aiohappyeyeballs (2.6.1-1) ...
57s Selecting previously unselected package python3-multidict.
57s Preparing to unpack .../13-python3-multidict_6.4.3-1_ppc64el.deb ...
57s Unpacking python3-multidict (6.4.3-1) ...
57s Selecting previously unselected package python3-propcache.
57s Preparing to unpack .../14-python3-propcache_0.3.1-1_ppc64el.deb ...
57s Unpacking python3-propcache (0.3.1-1) ...
57s Selecting previously unselected package python3-yarl.
57s Preparing to unpack .../15-python3-yarl_1.19.0-1_ppc64el.deb ...
57s Unpacking python3-yarl (1.19.0-1) ...
57s Selecting previously unselected package python3-async-timeout.
57s Preparing to unpack .../16-python3-async-timeout_5.0.1-1_all.deb ...
57s Unpacking python3-async-timeout (5.0.1-1) ...
57s Selecting previously unselected package python3-frozenlist.
57s Preparing to unpack .../17-python3-frozenlist_1.6.0-1_ppc64el.deb ...
57s Unpacking python3-frozenlist (1.6.0-1) ...
57s Selecting previously unselected package python3-aiosignal.
57s Preparing to unpack .../18-python3-aiosignal_1.3.2-1_all.deb ...
57s Unpacking python3-aiosignal (1.3.2-1) ...
57s Selecting previously unselected package python3-aiohttp.
57s Preparing to unpack .../19-python3-aiohttp_3.11.16-1_ppc64el.deb ...
57s Unpacking python3-aiohttp (3.11.16-1) ...
57s Selecting previously unselected package python3-all.
57s Preparing to unpack .../20-python3-all_3.13.4-1_ppc64el.deb ...
57s Unpacking python3-all (3.13.4-1) ...
57s Selecting previously unselected package python3-async-generator.
57s Preparing to unpack .../21-python3-async-generator_1.10-4_all.deb ...
57s Unpacking python3-async-generator (1.10-4) ...
57s Selecting previously unselected package python3-fsspec.
57s Preparing to unpack .../22-python3-fsspec_2025.3.2-1_all.deb ...
57s Unpacking python3-fsspec (2025.3.2-1) ...
57s Selecting previously unselected package python3-iniconfig.
57s Preparing to unpack .../23-python3-iniconfig_1.1.1-2_all.deb ...
57s Unpacking python3-iniconfig (1.1.1-2) ...
57s Selecting previously unselected package python3-pluggy.
57s Preparing to unpack .../24-python3-pluggy_1.5.0-1_all.deb ...
57s Unpacking python3-pluggy (1.5.0-1) ...
58s Selecting previously unselected package python3-pytest.
58s Preparing to unpack .../25-python3-pytest_8.3.5-2_all.deb ...
58s Unpacking python3-pytest (8.3.5-2) ...
58s Selecting previously unselected package python3-pytest-asyncio.
58s Preparing to unpack .../26-python3-pytest-asyncio_0.25.1-1_all.deb ...
58s Unpacking python3-pytest-asyncio (0.25.1-1) ...
58s Selecting previously unselected package python3-pytest-mock.
58s Preparing to unpack .../27-python3-pytest-mock_3.14.0-2_all.deb ...
58s Unpacking python3-pytest-mock (3.14.0-2) ...
58s Selecting previously unselected package python3-wrapt.
58s Preparing to unpack .../28-python3-wrapt_1.15.0-4build1_ppc64el.deb ...
58s Unpacking python3-wrapt (1.15.0-4build1) ...
58s Selecting previously unselected package python3-vcr.
58s Preparing to unpack .../29-python3-vcr_7.0.0-2_all.deb ...
58s Unpacking python3-vcr (7.0.0-2) ...
58s Selecting previously unselected package python3-pytest-vcr.
58s Preparing to unpack .../30-python3-pytest-vcr_1.0.2-4_all.deb ...
58s Unpacking python3-pytest-vcr (1.0.2-4) ...
58s Selecting previously unselected package python3-tqdm.
58s Preparing to unpack .../31-python3-tqdm_4.67.1-5_all.deb ...
58s Unpacking python3-tqdm (4.67.1-5) ...
58s Setting up python3-iniconfig (1.1.1-2) ...
58s Setting up fonts-lato (2.015-1) ...
58s Setting up python3-async-generator (1.10-4) ...
58s Setting up python3-fsspec (2025.3.2-1) ...
58s Setting up python3-tqdm (4.67.1-5) ...
58s Setting up python3-all (3.13.4-1) ...
58s Setting up python3-multidict (6.4.3-1) ...
58s Setting up python3-frozenlist (1.6.0-1) ...
59s Setting up python3-aiosignal (1.3.2-1) ...
59s Setting up python3-async-timeout (5.0.1-1) ...
59s Setting up libblas3:ppc64el (3.12.1-2build1) ...
59s update-alternatives: using /usr/lib/powerpc64le-linux-gnu/blas/libblas.so.3 to provide /usr/lib/powerpc64le-linux-gnu/libblas.so.3 (libblas.so.3-powerpc64le-linux-gnu) in auto mode
59s Setting up python3-numpy-dev:ppc64el (1:2.2.4+ds-1ubuntu1) ...
59s Setting up python3-wrapt (1.15.0-4build1) ...
59s Setting up python3-aiohappyeyeballs (2.6.1-1) ...
59s Setting up libgfortran5:ppc64el (15.1.0-8ubuntu1) ...
59s Setting up python3-pluggy (1.5.0-1) ...
59s Setting up python3-propcache (0.3.1-1) ...
59s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
59s Setting up python3-yarl (1.19.0-1) ...
59s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
59s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
59s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
59s Setting up liblapack3:ppc64el (3.12.1-2build1) ...
59s update-alternatives: using /usr/lib/powerpc64le-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/powerpc64le-linux-gnu/liblapack.so.3 (liblapack.so.3-powerpc64le-linux-gnu) in auto mode
59s Setting up python3-pytest (8.3.5-2) ...
60s Setting up python3-aiohttp (3.11.16-1) ...
60s Setting up python3-vcr (7.0.0-2) ...
60s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ... 62s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ... 62s Setting up python3-pytest-asyncio (0.25.1-1) ... 62s Setting up python3-pytest-mock (3.14.0-2) ... 62s Setting up python3-pytest-vcr (1.0.2-4) ... 62s Setting up python-fsspec-doc (2025.3.2-1) ... 62s Processing triggers for man-db (2.13.1-1) ... 63s Processing triggers for libc-bin (2.41-6ubuntu2) ... 63s autopkgtest [08:43:30]: test fsspec-tests: [----------------------- 64s 'fsspec/tests' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests' 64s 'fsspec/tests/__init__.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/__init__.py' 64s 'fsspec/tests/abstract' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract' 64s 'fsspec/tests/abstract/__init__.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/__init__.py' 64s 'fsspec/tests/abstract/common.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/common.py' 64s 'fsspec/tests/abstract/copy.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/copy.py' 64s 'fsspec/tests/abstract/get.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/get.py' 64s 'fsspec/tests/abstract/mv.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/mv.py' 64s 'fsspec/tests/abstract/open.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/open.py' 64s 'fsspec/tests/abstract/pipe.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/pipe.py' 64s 'fsspec/tests/abstract/put.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/abstract/put.py' 64s 'fsspec/tests/data' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/data' 64s 'fsspec/tests/data/listing.html' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/data/listing.html' 64s 'fsspec/tests/test_api.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_api.py' 64s 'fsspec/tests/test_async.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_async.py' 64s 'fsspec/tests/test_caches.py' -> 
'/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_caches.py' 64s 'fsspec/tests/test_callbacks.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_callbacks.py' 64s 'fsspec/tests/test_compression.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_compression.py' 64s 'fsspec/tests/test_config.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_config.py' 64s 'fsspec/tests/test_core.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_core.py' 64s 'fsspec/tests/test_downstream.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_downstream.py' 64s 'fsspec/tests/test_file.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_file.py' 64s 'fsspec/tests/test_fuse.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_fuse.py' 64s 'fsspec/tests/test_generic.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_generic.py' 64s 'fsspec/tests/test_gui.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_gui.py' 64s 'fsspec/tests/test_mapping.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_mapping.py' 64s 'fsspec/tests/test_parquet.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_parquet.py' 64s 'fsspec/tests/test_registry.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_registry.py' 64s 'fsspec/tests/test_spec.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_spec.py' 64s 'fsspec/tests/test_utils.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/test_utils.py' 64s 'fsspec/tests/conftest.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/tests/conftest.py' 64s 'fsspec/implementations/tests' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests' 64s 'fsspec/implementations/tests/__init__.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/__init__.py' 64s 'fsspec/implementations/tests/cassettes' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes' 64s 'fsspec/implementations/tests/cassettes/test_dbfs' -> 
'/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_file_listing.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_file_listing.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_mkdir.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_mkdir.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_pyarrow_non_partitioned.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_pyarrow_non_partitioned.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_range.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_range.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_range_chunked.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_range_chunked.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_write_and_read.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_write_and_read.yaml' 64s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_write_pyarrow_non_partitioned.yaml' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_write_pyarrow_non_partitioned.yaml' 64s 'fsspec/implementations/tests/conftest.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/conftest.py' 64s 'fsspec/implementations/tests/ftp_tls.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/ftp_tls.py' 64s 'fsspec/implementations/tests/keycert.pem' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/keycert.pem' 64s 'fsspec/implementations/tests/local' -> 
'/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/local' 64s 'fsspec/implementations/tests/local/__init__.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/local/__init__.py' 64s 'fsspec/implementations/tests/local/local_fixtures.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/local/local_fixtures.py' 64s 'fsspec/implementations/tests/local/local_test.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/local/local_test.py' 64s 'fsspec/implementations/tests/memory' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/memory' 64s 'fsspec/implementations/tests/memory/__init__.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/memory/__init__.py' 64s 'fsspec/implementations/tests/memory/memory_fixtures.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/memory/memory_fixtures.py' 64s 'fsspec/implementations/tests/memory/memory_test.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/memory/memory_test.py' 64s 'fsspec/implementations/tests/out.zip' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/out.zip' 64s 'fsspec/implementations/tests/test_archive.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_archive.py' 64s 'fsspec/implementations/tests/test_arrow.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_arrow.py' 64s 'fsspec/implementations/tests/test_asyn_wrapper.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_asyn_wrapper.py' 64s 'fsspec/implementations/tests/test_cached.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_cached.py' 64s 'fsspec/implementations/tests/test_common.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_common.py' 64s 'fsspec/implementations/tests/test_dask.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_dask.py' 64s 
'fsspec/implementations/tests/test_data.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_data.py' 64s 'fsspec/implementations/tests/test_dbfs.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_dbfs.py' 64s 'fsspec/implementations/tests/test_dirfs.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_dirfs.py' 64s 'fsspec/implementations/tests/test_ftp.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_ftp.py' 64s 'fsspec/implementations/tests/test_git.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_git.py' 64s 'fsspec/implementations/tests/test_github.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_github.py' 64s 'fsspec/implementations/tests/test_http.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_http.py' 64s 'fsspec/implementations/tests/test_http_sync.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_http_sync.py' 64s 'fsspec/implementations/tests/test_jupyter.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_jupyter.py' 64s 'fsspec/implementations/tests/test_libarchive.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_libarchive.py' 64s 'fsspec/implementations/tests/test_local.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_local.py' 64s 'fsspec/implementations/tests/test_memory.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_memory.py' 64s 'fsspec/implementations/tests/test_reference.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_reference.py' 64s 'fsspec/implementations/tests/test_sftp.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_sftp.py' 64s 'fsspec/implementations/tests/test_smb.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_smb.py' 64s 
'fsspec/implementations/tests/test_tar.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_tar.py' 64s 'fsspec/implementations/tests/test_webhdfs.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_webhdfs.py' 64s 'fsspec/implementations/tests/test_zip.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_zip.py' 64s 'fsspec/conftest.py' -> '/tmp/autopkgtest.keP0ZR/autopkgtest_tmp/conftest.py' 64s === python3.13 === 64s /usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset. 64s The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future. Valid fixture loop scopes are: "function", "class", "module", "package", "session" 64s 64s warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET)) 65s ============================= test session starts ============================== 65s platform linux -- Python 3.13.5, pytest-8.3.5, pluggy-1.5.0 65s rootdir: /tmp/autopkgtest.keP0ZR/autopkgtest_tmp 65s plugins: vcr-1.0.2, mock-3.14.0, asyncio-0.25.1, typeguard-4.4.2 65s asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=None 65s collected 790 items / 2 skipped 65s 65s tests/test_api.py ...............x...... [ 2%] 69s tests/test_async.py .........s... [ 4%] 69s tests/test_caches.py ................................................... [ 10%] 69s ........................................................................ [ 20%] 69s ....................... [ 22%] 69s tests/test_callbacks.py ........ [ 23%] 69s tests/test_compression.py ...sss [ 24%] 69s tests/test_config.py ....... 
[ 25%] 69s tests/test_core.py .................................................ss.. [ 32%] 69s sss.s [ 32%] 69s tests/test_file.py sssssssss.s [ 34%] 70s tests/test_generic.py ...... [ 35%] 70s tests/test_mapping.py ................. [ 37%] 70s tests/test_parquet.py ssssssssssssssssssssssssssssssssssssssssssssssssss [ 43%] 70s ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 52%] 70s ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 61%] 70s tests/test_registry.py ......s [ 62%] 100s tests/test_spec.py ....................x................................ [ 69%] 101s .....ssssssssss......................................................... [ 78%] 101s ........................................................................ [ 87%] 101s ................................. [ 91%] 101s tests/test_utils.py .................................................... [ 98%] 101s ............... [100%] 101s 101s =============================== warnings summary =============================== 101s tests/test_async.py::test_async_streamed_file_write 101s /usr/lib/python3.13/functools.py:77: RuntimeWarning: coroutine 'test_run_coros_in_chunks..runner' was never awaited 101s return partial(update_wrapper, wrapped=wrapped, 101s Enable tracemalloc to get traceback where the object was allocated. 101s See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info. 101s 101s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 101s =========== 565 passed, 225 skipped, 2 xfailed, 1 warning in 36.85s ============ 102s /usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset. 102s The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. 
Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future. Valid fixture loop scopes are: "function", "class", "module", "package", "session" 102s 102s warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET)) 104s ============================= test session starts ============================== 104s platform linux -- Python 3.13.5, pytest-8.3.5, pluggy-1.5.0 104s rootdir: /tmp/autopkgtest.keP0ZR/autopkgtest_tmp 104s plugins: vcr-1.0.2, mock-3.14.0, asyncio-0.25.1, typeguard-4.4.2 104s asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=None 104s collected 1005 items / 3 deselected / 7 skipped / 1002 selected 104s 104s implementations_tests/local/local_test.py .............................. [ 2%] 104s ........................................................................ [ 10%] 105s ................................. [ 13%] 105s implementations_tests/memory/memory_test.py ............................ [ 16%] 105s ........................................................................ [ 23%] 105s ..................................... [ 27%] 107s implementations_tests/test_archive.py .................................. [ 30%] 115s ...................................................sssssssssssssssss [ 37%] 115s implementations_tests/test_asyn_wrapper.py ......... [ 38%] 123s implementations_tests/test_cached.py ..........ssssssss......sss........ [ 41%] 124s ..........ssssssssssssssss.s........ssss..................... [ 47%] 124s implementations_tests/test_common.py ssss [ 48%] 124s implementations_tests/test_data.py .. [ 48%] 124s implementations_tests/test_dirfs.py .................................... [ 51%] 124s ........................................................................ [ 59%] 124s .......................... [ 61%] 124s implementations_tests/test_ftp.py sssssssssssssssssss [ 63%] 622s implementations_tests/test_github.py .FF.. 
[ 64%]
623s implementations_tests/test_http.py ..................................... [ 67%]
624s .................... [ 69%]
624s implementations_tests/test_http_sync.py ................................ [ 73%]
625s ....... [ 73%]
625s implementations_tests/test_libarchive.py s [ 73%]
625s implementations_tests/test_local.py .s........................s......... [ 77%]
625s ....................................................ss........ss.sssss.. [ 84%]
625s .....sss....s.......................... [ 88%]
626s implementations_tests/test_memory.py .............................. [ 91%]
626s implementations_tests/test_reference.py ..................s.....ss..ssss [ 94%]
626s s [ 94%]
626s implementations_tests/test_tar.py ......................... [ 97%]
626s implementations_tests/test_webhdfs.py ssssssssssss [ 98%]
626s implementations_tests/test_zip.py ............... [100%]
626s
626s =================================== FAILURES ===================================
626s _________________________ test_github_open_large_file __________________________
626s
626s self = 
626s addr_infos = []
626s req = 
626s timeout = ClientTimeout(total=300, connect=None, sock_read=None, sock_connect=30, ceil_threshold=5)
626s client_error = 
626s args = (functools.partial(, loop=<_UnixSelectorEventLoop running=True closed=False debug=False>),)
626s kwargs = {'server_hostname': 'raw.githubusercontent.com', 'ssl': }
626s
626s     async def _wrap_create_connection(
626s         self,
626s         *args: Any,
626s         addr_infos: List[aiohappyeyeballs.AddrInfoType],
626s         req: ClientRequest,
626s         timeout: "ClientTimeout",
626s         client_error: Type[Exception] = ClientConnectorError,
626s         **kwargs: Any,
626s     ) -> Tuple[asyncio.Transport, ResponseHandler]:
626s         try:
626s             async with ceil_timeout(
626s                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
626s             ):
626s >               sock = await aiohappyeyeballs.start_connection(
626s                     addr_infos=addr_infos,
626s                     local_addr_infos=self._local_addr_infos,
happy_eyeballs_delay=self._happy_eyeballs_delay, 626s interleave=self._interleave, 626s loop=self._loop, 626s ) 626s 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1115: 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s /usr/lib/python3/dist-packages/aiohappyeyeballs/impl.py:87: in start_connection 626s sock, _, _ = await _staggered.staggered_race( 626s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:165: in staggered_race 626s done = await _wait_one( 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s 626s futures = {.run_one_coro() done, defined at /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:115> result=None>} 626s loop = <_UnixSelectorEventLoop running=True closed=False debug=False> 626s 626s async def _wait_one( 626s futures: "Iterable[asyncio.Future[Any]]", 626s loop: asyncio.AbstractEventLoop, 626s ) -> _T: 626s """Wait for the first future to complete.""" 626s wait_next = loop.create_future() 626s 626s def _on_completion(fut: "asyncio.Future[Any]") -> None: 626s if not wait_next.done(): 626s wait_next.set_result(fut) 626s 626s for f in futures: 626s f.add_done_callback(_on_completion) 626s 626s try: 626s > return await wait_next 626s E asyncio.exceptions.CancelledError 626s 626s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:46: CancelledError 626s 626s The above exception was the direct cause of the following exception: 626s 626s self = , method = 'GET' 626s str_or_url = URL('https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv') 626s 626s async def _request( 626s self, 626s method: str, 626s str_or_url: StrOrURL, 626s *, 626s params: Query = None, 626s data: Any = None, 626s json: Any = None, 626s cookies: Optional[LooseCookies] = None, 626s headers: Optional[LooseHeaders] = None, 626s skip_auto_headers: Optional[Iterable[str]] = None, 626s auth: Optional[BasicAuth] = None, 626s allow_redirects: 
bool = True, 626s max_redirects: int = 10, 626s compress: Union[str, bool, None] = None, 626s chunked: Optional[bool] = None, 626s expect100: bool = False, 626s raise_for_status: Union[ 626s None, bool, Callable[[ClientResponse], Awaitable[None]] 626s ] = None, 626s read_until_eof: bool = True, 626s proxy: Optional[StrOrURL] = None, 626s proxy_auth: Optional[BasicAuth] = None, 626s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 626s verify_ssl: Optional[bool] = None, 626s fingerprint: Optional[bytes] = None, 626s ssl_context: Optional[SSLContext] = None, 626s ssl: Union[SSLContext, bool, Fingerprint] = True, 626s server_hostname: Optional[str] = None, 626s proxy_headers: Optional[LooseHeaders] = None, 626s trace_request_ctx: Optional[Mapping[str, Any]] = None, 626s read_bufsize: Optional[int] = None, 626s auto_decompress: Optional[bool] = None, 626s max_line_size: Optional[int] = None, 626s max_field_size: Optional[int] = None, 626s ) -> ClientResponse: 626s 626s # NOTE: timeout clamps existing connect and read timeouts. We cannot 626s # set the default to None because we need to detect if the user wants 626s # to use the existing timeouts by setting timeout to None. 
626s 626s if self.closed: 626s raise RuntimeError("Session is closed") 626s 626s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 626s 626s if data is not None and json is not None: 626s raise ValueError( 626s "data and json parameters can not be used at the same time" 626s ) 626s elif json is not None: 626s data = payload.JsonPayload(json, dumps=self._json_serialize) 626s 626s if not isinstance(chunked, bool) and chunked is not None: 626s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 626s 626s redirects = 0 626s history: List[ClientResponse] = [] 626s version = self._version 626s params = params or {} 626s 626s # Merge with default headers and transform to CIMultiDict 626s headers = self._prepare_headers(headers) 626s 626s try: 626s url = self._build_url(str_or_url) 626s except ValueError as e: 626s raise InvalidUrlClientError(str_or_url) from e 626s 626s assert self._connector is not None 626s if url.scheme not in self._connector.allowed_protocol_schema_set: 626s raise NonHttpUrlClientError(url) 626s 626s skip_headers: Optional[Iterable[istr]] 626s if skip_auto_headers is not None: 626s skip_headers = { 626s istr(i) for i in skip_auto_headers 626s } | self._skip_auto_headers 626s elif self._skip_auto_headers: 626s skip_headers = self._skip_auto_headers 626s else: 626s skip_headers = None 626s 626s if proxy is None: 626s proxy = self._default_proxy 626s if proxy_auth is None: 626s proxy_auth = self._default_proxy_auth 626s 626s if proxy is None: 626s proxy_headers = None 626s else: 626s proxy_headers = self._prepare_headers(proxy_headers) 626s try: 626s proxy = URL(proxy) 626s except ValueError as e: 626s raise InvalidURL(proxy) from e 626s 626s if timeout is sentinel: 626s real_timeout: ClientTimeout = self._timeout 626s else: 626s if not isinstance(timeout, ClientTimeout): 626s real_timeout = ClientTimeout(total=timeout) 626s else: 626s real_timeout = timeout 626s # timeout is cumulative for all request operations 626s # 
(request, redirects, responses, data consuming) 626s tm = TimeoutHandle( 626s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 626s ) 626s handle = tm.start() 626s 626s if read_bufsize is None: 626s read_bufsize = self._read_bufsize 626s 626s if auto_decompress is None: 626s auto_decompress = self._auto_decompress 626s 626s if max_line_size is None: 626s max_line_size = self._max_line_size 626s 626s if max_field_size is None: 626s max_field_size = self._max_field_size 626s 626s traces = [ 626s Trace( 626s self, 626s trace_config, 626s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 626s ) 626s for trace_config in self._trace_configs 626s ] 626s 626s for trace in traces: 626s await trace.send_request_start(method, url.update_query(params), headers) 626s 626s timer = tm.timer() 626s try: 626s with timer: 626s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 626s retry_persistent_connection = ( 626s self._retry_connection and method in IDEMPOTENT_METHODS 626s ) 626s while True: 626s url, auth_from_url = strip_auth_from_url(url) 626s if not url.raw_host: 626s # NOTE: Bail early, otherwise, causes `InvalidURL` through 626s # NOTE: `self._request_class()` below. 
626s err_exc_cls = ( 626s InvalidUrlRedirectClientError 626s if redirects 626s else InvalidUrlClientError 626s ) 626s raise err_exc_cls(url) 626s # If `auth` was passed for an already authenticated URL, 626s # disallow only if this is the initial URL; this is to avoid issues 626s # with sketchy redirects that are not the caller's responsibility 626s if not history and (auth and auth_from_url): 626s raise ValueError( 626s "Cannot combine AUTH argument with " 626s "credentials encoded in URL" 626s ) 626s 626s # Override the auth with the one from the URL only if we 626s # have no auth, or if we got an auth from a redirect URL 626s if auth is None or (history and auth_from_url is not None): 626s auth = auth_from_url 626s 626s if ( 626s auth is None 626s and self._default_auth 626s and ( 626s not self._base_url or self._base_url_origin == url.origin() 626s ) 626s ): 626s auth = self._default_auth 626s # It would be confusing if we support explicit 626s # Authorization header with auth argument 626s if ( 626s headers is not None 626s and auth is not None 626s and hdrs.AUTHORIZATION in headers 626s ): 626s raise ValueError( 626s "Cannot combine AUTHORIZATION header " 626s "with AUTH argument or credentials " 626s "encoded in URL" 626s ) 626s 626s all_cookies = self._cookie_jar.filter_cookies(url) 626s 626s if cookies is not None: 626s tmp_cookie_jar = CookieJar( 626s quote_cookie=self._cookie_jar.quote_cookie 626s ) 626s tmp_cookie_jar.update_cookies(cookies) 626s req_cookies = tmp_cookie_jar.filter_cookies(url) 626s if req_cookies: 626s all_cookies.load(req_cookies) 626s 626s if proxy is not None: 626s proxy = URL(proxy) 626s elif self._trust_env: 626s with suppress(LookupError): 626s proxy, proxy_auth = get_env_proxy_for_url(url) 626s 626s req = self._request_class( 626s method, 626s url, 626s params=params, 626s headers=headers, 626s skip_auto_headers=skip_headers, 626s data=data, 626s cookies=all_cookies, 626s auth=auth, 626s version=version, 626s compress=compress, 
626s chunked=chunked, 626s expect100=expect100, 626s loop=self._loop, 626s response_class=self._response_class, 626s proxy=proxy, 626s proxy_auth=proxy_auth, 626s timer=timer, 626s session=self, 626s ssl=ssl if ssl is not None else True, 626s server_hostname=server_hostname, 626s proxy_headers=proxy_headers, 626s traces=traces, 626s trust_env=self.trust_env, 626s ) 626s 626s # connection timeout 626s try: 626s > conn = await self._connector.connect( 626s req, traces=traces, timeout=real_timeout 626s ) 626s 626s /usr/lib/python3/dist-packages/aiohttp/client.py:703: 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:548: in connect 626s proto = await self._create_connection(req, traces, timeout) 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1056: in _create_connection 626s _, proto = await self._create_direct_connection(req, traces, timeout) 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1400: in _create_direct_connection 626s raise last_exc 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1369: in _create_direct_connection 626s transp, proto = await self._wrap_create_connection( 626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1112: in _wrap_create_connection 626s async with ceil_timeout( 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s 626s self = 626s exc_type = 626s exc_val = CancelledError(), exc_tb = 626s 626s async def __aexit__( 626s self, 626s exc_type: Optional[Type[BaseException]], 626s exc_val: Optional[BaseException], 626s exc_tb: Optional[TracebackType], 626s ) -> Optional[bool]: 626s assert self._state in (_State.ENTERED, _State.EXPIRING) 626s 626s if self._timeout_handler is not None: 626s self._timeout_handler.cancel() 626s self._timeout_handler = None 626s 626s if self._state is _State.EXPIRING: 626s self._state = _State.EXPIRED 626s 626s if self._task.uncancel() <= self._cancelling 
and exc_type is not None: 626s # Since there are no new cancel requests, we're 626s # handling this. 626s if issubclass(exc_type, exceptions.CancelledError): 626s > raise TimeoutError from exc_val 626s E TimeoutError 626s 626s /usr/lib/python3.13/asyncio/timeouts.py:116: TimeoutError 626s 626s The above exception was the direct cause of the following exception: 626s 626s self = 626s url = 'https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv' 626s kwargs = {}, info = {} 626s session = 626s policy = 'get' 626s 626s async def _info(self, url, **kwargs): 626s """Get info of URL 626s 626s Tries to access location via HEAD, and then GET methods, but does 626s not fetch the data. 626s 626s It is possible that the server does not supply any size information, in 626s which case size will be given as None (and certain operations on the 626s corresponding file will not work). 626s """ 626s info = {} 626s session = await self.set_session() 626s 626s for policy in ["head", "get"]: 626s try: 626s info.update( 626s > await _file_info( 626s self.encode_url(url), 626s size_policy=policy, 626s session=session, 626s **self.kwargs, 626s **kwargs, 626s ) 626s ) 626s 626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:427: 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:839: in _file_info 626s r = await session.get(url, allow_redirects=ar, **kwargs) 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s 626s self = , method = 'GET' 626s str_or_url = URL('https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv') 626s 626s async def _request( 626s self, 626s method: str, 626s str_or_url: StrOrURL, 626s *, 626s params: Query = None, 626s data: Any = None, 626s json: Any = None, 626s cookies: Optional[LooseCookies] = None, 626s headers: Optional[LooseHeaders] = None, 626s 
skip_auto_headers: Optional[Iterable[str]] = None, 626s auth: Optional[BasicAuth] = None, 626s allow_redirects: bool = True, 626s max_redirects: int = 10, 626s compress: Union[str, bool, None] = None, 626s chunked: Optional[bool] = None, 626s expect100: bool = False, 626s raise_for_status: Union[ 626s None, bool, Callable[[ClientResponse], Awaitable[None]] 626s ] = None, 626s read_until_eof: bool = True, 626s proxy: Optional[StrOrURL] = None, 626s proxy_auth: Optional[BasicAuth] = None, 626s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 626s verify_ssl: Optional[bool] = None, 626s fingerprint: Optional[bytes] = None, 626s ssl_context: Optional[SSLContext] = None, 626s ssl: Union[SSLContext, bool, Fingerprint] = True, 626s server_hostname: Optional[str] = None, 626s proxy_headers: Optional[LooseHeaders] = None, 626s trace_request_ctx: Optional[Mapping[str, Any]] = None, 626s read_bufsize: Optional[int] = None, 626s auto_decompress: Optional[bool] = None, 626s max_line_size: Optional[int] = None, 626s max_field_size: Optional[int] = None, 626s ) -> ClientResponse: 626s 626s # NOTE: timeout clamps existing connect and read timeouts. We cannot 626s # set the default to None because we need to detect if the user wants 626s # to use the existing timeouts by setting timeout to None. 
626s 626s if self.closed: 626s raise RuntimeError("Session is closed") 626s 626s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 626s 626s if data is not None and json is not None: 626s raise ValueError( 626s "data and json parameters can not be used at the same time" 626s ) 626s elif json is not None: 626s data = payload.JsonPayload(json, dumps=self._json_serialize) 626s 626s if not isinstance(chunked, bool) and chunked is not None: 626s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 626s 626s redirects = 0 626s history: List[ClientResponse] = [] 626s version = self._version 626s params = params or {} 626s 626s # Merge with default headers and transform to CIMultiDict 626s headers = self._prepare_headers(headers) 626s 626s try: 626s url = self._build_url(str_or_url) 626s except ValueError as e: 626s raise InvalidUrlClientError(str_or_url) from e 626s 626s assert self._connector is not None 626s if url.scheme not in self._connector.allowed_protocol_schema_set: 626s raise NonHttpUrlClientError(url) 626s 626s skip_headers: Optional[Iterable[istr]] 626s if skip_auto_headers is not None: 626s skip_headers = { 626s istr(i) for i in skip_auto_headers 626s } | self._skip_auto_headers 626s elif self._skip_auto_headers: 626s skip_headers = self._skip_auto_headers 626s else: 626s skip_headers = None 626s 626s if proxy is None: 626s proxy = self._default_proxy 626s if proxy_auth is None: 626s proxy_auth = self._default_proxy_auth 626s 626s if proxy is None: 626s proxy_headers = None 626s else: 626s proxy_headers = self._prepare_headers(proxy_headers) 626s try: 626s proxy = URL(proxy) 626s except ValueError as e: 626s raise InvalidURL(proxy) from e 626s 626s if timeout is sentinel: 626s real_timeout: ClientTimeout = self._timeout 626s else: 626s if not isinstance(timeout, ClientTimeout): 626s real_timeout = ClientTimeout(total=timeout) 626s else: 626s real_timeout = timeout 626s # timeout is cumulative for all request operations 626s # 
(request, redirects, responses, data consuming) 626s tm = TimeoutHandle( 626s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 626s ) 626s handle = tm.start() 626s 626s if read_bufsize is None: 626s read_bufsize = self._read_bufsize 626s 626s if auto_decompress is None: 626s auto_decompress = self._auto_decompress 626s 626s if max_line_size is None: 626s max_line_size = self._max_line_size 626s 626s if max_field_size is None: 626s max_field_size = self._max_field_size 626s 626s traces = [ 626s Trace( 626s self, 626s trace_config, 626s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 626s ) 626s for trace_config in self._trace_configs 626s ] 626s 626s for trace in traces: 626s await trace.send_request_start(method, url.update_query(params), headers) 626s 626s timer = tm.timer() 626s try: 626s with timer: 626s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 626s retry_persistent_connection = ( 626s self._retry_connection and method in IDEMPOTENT_METHODS 626s ) 626s while True: 626s url, auth_from_url = strip_auth_from_url(url) 626s if not url.raw_host: 626s # NOTE: Bail early, otherwise, causes `InvalidURL` through 626s # NOTE: `self._request_class()` below. 
626s err_exc_cls = ( 626s InvalidUrlRedirectClientError 626s if redirects 626s else InvalidUrlClientError 626s ) 626s raise err_exc_cls(url) 626s # If `auth` was passed for an already authenticated URL, 626s # disallow only if this is the initial URL; this is to avoid issues 626s # with sketchy redirects that are not the caller's responsibility 626s if not history and (auth and auth_from_url): 626s raise ValueError( 626s "Cannot combine AUTH argument with " 626s "credentials encoded in URL" 626s ) 626s 626s # Override the auth with the one from the URL only if we 626s # have no auth, or if we got an auth from a redirect URL 626s if auth is None or (history and auth_from_url is not None): 626s auth = auth_from_url 626s 626s if ( 626s auth is None 626s and self._default_auth 626s and ( 626s not self._base_url or self._base_url_origin == url.origin() 626s ) 626s ): 626s auth = self._default_auth 626s # It would be confusing if we support explicit 626s # Authorization header with auth argument 626s if ( 626s headers is not None 626s and auth is not None 626s and hdrs.AUTHORIZATION in headers 626s ): 626s raise ValueError( 626s "Cannot combine AUTHORIZATION header " 626s "with AUTH argument or credentials " 626s "encoded in URL" 626s ) 626s 626s all_cookies = self._cookie_jar.filter_cookies(url) 626s 626s if cookies is not None: 626s tmp_cookie_jar = CookieJar( 626s quote_cookie=self._cookie_jar.quote_cookie 626s ) 626s tmp_cookie_jar.update_cookies(cookies) 626s req_cookies = tmp_cookie_jar.filter_cookies(url) 626s if req_cookies: 626s all_cookies.load(req_cookies) 626s 626s if proxy is not None: 626s proxy = URL(proxy) 626s elif self._trust_env: 626s with suppress(LookupError): 626s proxy, proxy_auth = get_env_proxy_for_url(url) 626s 626s req = self._request_class( 626s method, 626s url, 626s params=params, 626s headers=headers, 626s skip_auto_headers=skip_headers, 626s data=data, 626s cookies=all_cookies, 626s auth=auth, 626s version=version, 626s compress=compress, 
626s                     chunked=chunked,
626s                     expect100=expect100,
626s                     loop=self._loop,
626s                     response_class=self._response_class,
626s                     proxy=proxy,
626s                     proxy_auth=proxy_auth,
626s                     timer=timer,
626s                     session=self,
626s                     ssl=ssl if ssl is not None else True,
626s                     server_hostname=server_hostname,
626s                     proxy_headers=proxy_headers,
626s                     traces=traces,
626s                     trust_env=self.trust_env,
626s                 )
626s
626s                 # connection timeout
626s                 try:
626s                     conn = await self._connector.connect(
626s                         req, traces=traces, timeout=real_timeout
626s                     )
626s                 except asyncio.TimeoutError as exc:
626s >                   raise ConnectionTimeoutError(
626s                         f"Connection timeout to host {url}"
626s                     ) from exc
626s E                   aiohttp.client_exceptions.ConnectionTimeoutError: Connection timeout to host https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv
626s
626s /usr/lib/python3/dist-packages/aiohttp/client.py:707: ConnectionTimeoutError
626s
626s The above exception was the direct cause of the following exception:
626s
626s     def test_github_open_large_file():
626s         # test opening a large file >1 MB
626s         # use block_size=0 to get a streaming interface to the file, ensuring that
626s         # we fetch only the parts we need instead of downloading the full file all
626s         # at once
626s >       with fsspec.open(
626s             "github://mwaskom:seaborn-data@83bfba7/brain_networks.csv", block_size=0
626s         ) as f:
626s
626s /tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_github.py:15:
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s /usr/lib/python3/dist-packages/fsspec/core.py:105: in __enter__
626s     f = self.fs.open(self.path, mode=mode)
626s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
626s     f = self._open(
626s /usr/lib/python3/dist-packages/fsspec/implementations/github.py:261: in _open
626s     return self.http_fs.open(
626s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
626s     f = self._open(
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:366: in _open
626s     size = size or info.update(self.info(path, **kwargs)) or info["size"]
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:118: in wrapper
626s     return sync(self.loop, func, *args, **kwargs)
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:103: in sync
626s     raise return_result
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:56: in _runner
626s     result[0] = await coro
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s
626s self = 
626s url = 'https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv'
626s kwargs = {}, info = {}
626s session = 
626s policy = 'get'
626s
626s     async def _info(self, url, **kwargs):
626s         """Get info of URL
626s
626s         Tries to access location via HEAD, and then GET methods, but does
626s         not fetch the data.
626s
626s         It is possible that the server does not supply any size information, in
626s         which case size will be given as None (and certain operations on the
626s         corresponding file will not work).
626s         """
626s         info = {}
626s         session = await self.set_session()
626s
626s         for policy in ["head", "get"]:
626s             try:
626s                 info.update(
626s                     await _file_info(
626s                         self.encode_url(url),
626s                         size_policy=policy,
626s                         session=session,
626s                         **self.kwargs,
626s                         **kwargs,
626s                     )
626s                 )
626s                 if info.get("size") is not None:
626s                     break
626s             except Exception as exc:
626s                 if policy == "get":
626s                     # If get failed, then raise a FileNotFoundError
626s >                   raise FileNotFoundError(url) from exc
626s E                   FileNotFoundError: https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv
626s
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:440: FileNotFoundError
626s __________________________ test_github_open_lfs_file ___________________________
626s
626s self = 
626s addr_infos = []
626s req = 
626s timeout = ClientTimeout(total=300, connect=None, sock_read=None, sock_connect=30, ceil_threshold=5)
626s client_error = 
626s args = (functools.partial(, loop=<_UnixSelectorEventLoop running=True closed=False debug=False>),)
626s kwargs = {'server_hostname': 'media.githubusercontent.com', 'ssl': }
626s
626s     async def _wrap_create_connection(
626s         self,
626s         *args: Any,
626s         addr_infos: List[aiohappyeyeballs.AddrInfoType],
626s         req: ClientRequest,
626s         timeout: "ClientTimeout",
626s         client_error: Type[Exception] = ClientConnectorError,
626s         **kwargs: Any,
626s     ) -> Tuple[asyncio.Transport, ResponseHandler]:
626s         try:
626s             async with ceil_timeout(
626s                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
626s             ):
626s >               sock = await aiohappyeyeballs.start_connection(
626s                     addr_infos=addr_infos,
626s                     local_addr_infos=self._local_addr_infos,
626s                     happy_eyeballs_delay=self._happy_eyeballs_delay,
626s                     interleave=self._interleave,
626s                     loop=self._loop,
626s                 )
626s
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1115:
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s
/usr/lib/python3/dist-packages/aiohappyeyeballs/impl.py:87: in start_connection 626s sock, _, _ = await _staggered.staggered_race( 626s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:165: in staggered_race 626s done = await _wait_one( 626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 626s 626s futures = {.run_one_coro() done, defined at /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:115> result=None>} 626s loop = <_UnixSelectorEventLoop running=True closed=False debug=False> 626s 626s async def _wait_one( 626s futures: "Iterable[asyncio.Future[Any]]", 626s loop: asyncio.AbstractEventLoop, 626s ) -> _T: 626s """Wait for the first future to complete.""" 626s wait_next = loop.create_future() 626s 626s def _on_completion(fut: "asyncio.Future[Any]") -> None: 626s if not wait_next.done(): 626s wait_next.set_result(fut) 626s 626s for f in futures: 626s f.add_done_callback(_on_completion) 626s 626s try: 626s > return await wait_next 626s E asyncio.exceptions.CancelledError 626s 626s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:46: CancelledError 626s 626s The above exception was the direct cause of the following exception: 626s 626s self = , method = 'GET' 626s str_or_url = URL('https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt') 626s 626s async def _request( 626s self, 626s method: str, 626s str_or_url: StrOrURL, 626s *, 626s params: Query = None, 626s data: Any = None, 626s json: Any = None, 626s cookies: Optional[LooseCookies] = None, 626s headers: Optional[LooseHeaders] = None, 626s skip_auto_headers: Optional[Iterable[str]] = None, 626s auth: Optional[BasicAuth] = None, 626s allow_redirects: bool = True, 626s max_redirects: int = 10, 626s compress: Union[str, bool, None] = None, 626s chunked: Optional[bool] = None, 626s expect100: bool = False, 626s raise_for_status: Union[ 626s None, bool, Callable[[ClientResponse], 
Awaitable[None]] 626s ] = None, 626s read_until_eof: bool = True, 626s proxy: Optional[StrOrURL] = None, 626s proxy_auth: Optional[BasicAuth] = None, 626s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 626s verify_ssl: Optional[bool] = None, 626s fingerprint: Optional[bytes] = None, 626s ssl_context: Optional[SSLContext] = None, 626s ssl: Union[SSLContext, bool, Fingerprint] = True, 626s server_hostname: Optional[str] = None, 626s proxy_headers: Optional[LooseHeaders] = None, 626s trace_request_ctx: Optional[Mapping[str, Any]] = None, 626s read_bufsize: Optional[int] = None, 626s auto_decompress: Optional[bool] = None, 626s max_line_size: Optional[int] = None, 626s max_field_size: Optional[int] = None, 626s ) -> ClientResponse: 626s 626s # NOTE: timeout clamps existing connect and read timeouts. We cannot 626s # set the default to None because we need to detect if the user wants 626s # to use the existing timeouts by setting timeout to None. 626s 626s if self.closed: 626s raise RuntimeError("Session is closed") 626s 626s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 626s 626s if data is not None and json is not None: 626s raise ValueError( 626s "data and json parameters can not be used at the same time" 626s ) 626s elif json is not None: 626s data = payload.JsonPayload(json, dumps=self._json_serialize) 626s 626s if not isinstance(chunked, bool) and chunked is not None: 626s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 626s 626s redirects = 0 626s history: List[ClientResponse] = [] 626s version = self._version 626s params = params or {} 626s 626s # Merge with default headers and transform to CIMultiDict 626s headers = self._prepare_headers(headers) 626s 626s try: 626s url = self._build_url(str_or_url) 626s except ValueError as e: 626s raise InvalidUrlClientError(str_or_url) from e 626s 626s assert self._connector is not None 626s if url.scheme not in self._connector.allowed_protocol_schema_set: 626s raise 
NonHttpUrlClientError(url) 626s 626s skip_headers: Optional[Iterable[istr]] 626s if skip_auto_headers is not None: 626s skip_headers = { 626s istr(i) for i in skip_auto_headers 626s } | self._skip_auto_headers 626s elif self._skip_auto_headers: 626s skip_headers = self._skip_auto_headers 626s else: 626s skip_headers = None 626s 626s if proxy is None: 626s proxy = self._default_proxy 626s if proxy_auth is None: 626s proxy_auth = self._default_proxy_auth 626s 626s if proxy is None: 626s proxy_headers = None 626s else: 626s proxy_headers = self._prepare_headers(proxy_headers) 626s try: 626s proxy = URL(proxy) 626s except ValueError as e: 626s raise InvalidURL(proxy) from e 626s 626s if timeout is sentinel: 626s real_timeout: ClientTimeout = self._timeout 626s else: 626s if not isinstance(timeout, ClientTimeout): 626s real_timeout = ClientTimeout(total=timeout) 626s else: 626s real_timeout = timeout 626s # timeout is cumulative for all request operations 626s # (request, redirects, responses, data consuming) 626s tm = TimeoutHandle( 626s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 626s ) 626s handle = tm.start() 626s 626s if read_bufsize is None: 626s read_bufsize = self._read_bufsize 626s 626s if auto_decompress is None: 626s auto_decompress = self._auto_decompress 626s 626s if max_line_size is None: 626s max_line_size = self._max_line_size 626s 626s if max_field_size is None: 626s max_field_size = self._max_field_size 626s 626s traces = [ 626s Trace( 626s self, 626s trace_config, 626s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 626s ) 626s for trace_config in self._trace_configs 626s ] 626s 626s for trace in traces: 626s await trace.send_request_start(method, url.update_query(params), headers) 626s 626s timer = tm.timer() 626s try: 626s with timer: 626s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 626s retry_persistent_connection = ( 626s self._retry_connection and method in 
626s                     IDEMPOTENT_METHODS
626s                 )
626s                 while True:
626s                     url, auth_from_url = strip_auth_from_url(url)
626s                     if not url.raw_host:
626s                         # NOTE: Bail early, otherwise, causes `InvalidURL` through
626s                         # NOTE: `self._request_class()` below.
626s                         err_exc_cls = (
626s                             InvalidUrlRedirectClientError
626s                             if redirects
626s                             else InvalidUrlClientError
626s                         )
626s                         raise err_exc_cls(url)
626s                     # If `auth` was passed for an already authenticated URL,
626s                     # disallow only if this is the initial URL; this is to avoid issues
626s                     # with sketchy redirects that are not the caller's responsibility
626s                     if not history and (auth and auth_from_url):
626s                         raise ValueError(
626s                             "Cannot combine AUTH argument with "
626s                             "credentials encoded in URL"
626s                         )
626s 
626s                     # Override the auth with the one from the URL only if we
626s                     # have no auth, or if we got an auth from a redirect URL
626s                     if auth is None or (history and auth_from_url is not None):
626s                         auth = auth_from_url
626s 
626s                     if (
626s                         auth is None
626s                         and self._default_auth
626s                         and (
626s                             not self._base_url or self._base_url_origin == url.origin()
626s                         )
626s                     ):
626s                         auth = self._default_auth
626s                     # It would be confusing if we support explicit
626s                     # Authorization header with auth argument
626s                     if (
626s                         headers is not None
626s                         and auth is not None
626s                         and hdrs.AUTHORIZATION in headers
626s                     ):
626s                         raise ValueError(
626s                             "Cannot combine AUTHORIZATION header "
626s                             "with AUTH argument or credentials "
626s                             "encoded in URL"
626s                         )
626s 
626s                     all_cookies = self._cookie_jar.filter_cookies(url)
626s 
626s                     if cookies is not None:
626s                         tmp_cookie_jar = CookieJar(
626s                             quote_cookie=self._cookie_jar.quote_cookie
626s                         )
626s                         tmp_cookie_jar.update_cookies(cookies)
626s                         req_cookies = tmp_cookie_jar.filter_cookies(url)
626s                         if req_cookies:
626s                             all_cookies.load(req_cookies)
626s 
626s                     if proxy is not None:
626s                         proxy = URL(proxy)
626s                     elif self._trust_env:
626s                         with suppress(LookupError):
626s                             proxy, proxy_auth = get_env_proxy_for_url(url)
626s 
626s                     req = self._request_class(
626s                         method,
626s                         url,
626s                         params=params,
626s                         headers=headers,
626s                         skip_auto_headers=skip_headers,
626s                         data=data,
626s                         cookies=all_cookies,
626s                         auth=auth,
626s                         version=version,
626s                         compress=compress,
626s                         chunked=chunked,
626s                         expect100=expect100,
626s                         loop=self._loop,
626s                         response_class=self._response_class,
626s                         proxy=proxy,
626s                         proxy_auth=proxy_auth,
626s                         timer=timer,
626s                         session=self,
626s                         ssl=ssl if ssl is not None else True,
626s                         server_hostname=server_hostname,
626s                         proxy_headers=proxy_headers,
626s                         traces=traces,
626s                         trust_env=self.trust_env,
626s                     )
626s 
626s                     # connection timeout
626s                     try:
626s >                       conn = await self._connector.connect(
626s                             req, traces=traces, timeout=real_timeout
626s                         )
626s 
626s /usr/lib/python3/dist-packages/aiohttp/client.py:703:
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:548: in connect
626s     proto = await self._create_connection(req, traces, timeout)
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1056: in _create_connection
626s     _, proto = await self._create_direct_connection(req, traces, timeout)
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1400: in _create_direct_connection
626s     raise last_exc
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1369: in _create_direct_connection
626s     transp, proto = await self._wrap_create_connection(
626s /usr/lib/python3/dist-packages/aiohttp/connector.py:1112: in _wrap_create_connection
626s     async with ceil_timeout(
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s 
626s self = 
626s exc_type = 
626s exc_val = CancelledError(), exc_tb = 
626s 
626s     async def __aexit__(
626s         self,
626s         exc_type: Optional[Type[BaseException]],
626s         exc_val: Optional[BaseException],
626s         exc_tb: Optional[TracebackType],
626s     ) -> Optional[bool]:
626s         assert self._state in (_State.ENTERED, _State.EXPIRING)
626s 
626s         if self._timeout_handler is not None:
626s             self._timeout_handler.cancel()
626s             self._timeout_handler = None
626s 
626s         if self._state is _State.EXPIRING:
626s             self._state = _State.EXPIRED
626s 
626s             if self._task.uncancel() <= self._cancelling and exc_type is not None:
626s                 # Since there are no new cancel requests, we're
626s                 # handling this.
626s                 if issubclass(exc_type, exceptions.CancelledError):
626s >                   raise TimeoutError from exc_val
626s E                   TimeoutError
626s 
626s /usr/lib/python3.13/asyncio/timeouts.py:116: TimeoutError
626s 
626s The above exception was the direct cause of the following exception:
626s 
626s self = 
626s url = 'https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt'
626s kwargs = {}, info = {}
626s session = 
626s policy = 'get'
626s 
626s     async def _info(self, url, **kwargs):
626s         """Get info of URL
626s 
626s         Tries to access location via HEAD, and then GET methods, but does
626s         not fetch the data.
626s 
626s         It is possible that the server does not supply any size information, in
626s         which case size will be given as None (and certain operations on the
626s         corresponding file will not work).
626s         """
626s         info = {}
626s         session = await self.set_session()
626s 
626s         for policy in ["head", "get"]:
626s             try:
626s                 info.update(
626s >                   await _file_info(
626s                         self.encode_url(url),
626s                         size_policy=policy,
626s                         session=session,
626s                         **self.kwargs,
626s                         **kwargs,
626s                     )
626s                 )
626s 
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:427:
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:839: in _file_info
626s     r = await session.get(url, allow_redirects=ar, **kwargs)
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s 
626s self = , method = 'GET'
626s str_or_url = URL('https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt')
626s 
626s     async def _request(
626s         self,
626s         method: str,
626s         str_or_url: StrOrURL,
626s         *,
626s         params: Query = None,
626s         data: Any = None,
626s         json: Any = None,
626s         cookies: Optional[LooseCookies] = None,
626s         headers: Optional[LooseHeaders] = None,
626s         skip_auto_headers: Optional[Iterable[str]] = None,
626s         auth: Optional[BasicAuth] = None,
626s         allow_redirects: bool = True,
626s         max_redirects: int = 10,
626s         compress: Union[str, bool, None] = None,
626s         chunked: Optional[bool] = None,
626s         expect100: bool = False,
626s         raise_for_status: Union[
626s             None, bool, Callable[[ClientResponse], Awaitable[None]]
626s         ] = None,
626s         read_until_eof: bool = True,
626s         proxy: Optional[StrOrURL] = None,
626s         proxy_auth: Optional[BasicAuth] = None,
626s         timeout: Union[ClientTimeout, _SENTINEL] = sentinel,
626s         verify_ssl: Optional[bool] = None,
626s         fingerprint: Optional[bytes] = None,
626s         ssl_context: Optional[SSLContext] = None,
626s         ssl: Union[SSLContext, bool, Fingerprint] = True,
626s         server_hostname: Optional[str] = None,
626s         proxy_headers: Optional[LooseHeaders] = None,
626s         trace_request_ctx: Optional[Mapping[str, Any]] = None,
626s         read_bufsize: Optional[int] = None,
626s         auto_decompress: Optional[bool] = None,
626s         max_line_size: Optional[int] = None,
626s         max_field_size: Optional[int] = None,
626s     ) -> ClientResponse:
626s 
626s         # NOTE: timeout clamps existing connect and read timeouts. We cannot
626s         # set the default to None because we need to detect if the user wants
626s         # to use the existing timeouts by setting timeout to None.
626s 
626s         if self.closed:
626s             raise RuntimeError("Session is closed")
626s 
626s         ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint)
626s 
626s         if data is not None and json is not None:
626s             raise ValueError(
626s                 "data and json parameters can not be used at the same time"
626s             )
626s         elif json is not None:
626s             data = payload.JsonPayload(json, dumps=self._json_serialize)
626s 
626s         if not isinstance(chunked, bool) and chunked is not None:
626s             warnings.warn("Chunk size is deprecated #1615", DeprecationWarning)
626s 
626s         redirects = 0
626s         history: List[ClientResponse] = []
626s         version = self._version
626s         params = params or {}
626s 
626s         # Merge with default headers and transform to CIMultiDict
626s         headers = self._prepare_headers(headers)
626s 
626s         try:
626s             url = self._build_url(str_or_url)
626s         except ValueError as e:
626s             raise InvalidUrlClientError(str_or_url) from e
626s 
626s         assert self._connector is not None
626s         if url.scheme not in self._connector.allowed_protocol_schema_set:
626s             raise NonHttpUrlClientError(url)
626s 
626s         skip_headers: Optional[Iterable[istr]]
626s         if skip_auto_headers is not None:
626s             skip_headers = {
626s                 istr(i) for i in skip_auto_headers
626s             } | self._skip_auto_headers
626s         elif self._skip_auto_headers:
626s             skip_headers = self._skip_auto_headers
626s         else:
626s             skip_headers = None
626s 
626s         if proxy is None:
626s             proxy = self._default_proxy
626s         if proxy_auth is None:
626s             proxy_auth = self._default_proxy_auth
626s 
626s         if proxy is None:
626s             proxy_headers = None
626s         else:
626s             proxy_headers = self._prepare_headers(proxy_headers)
626s             try:
626s                 proxy = URL(proxy)
626s             except ValueError as e:
626s                 raise InvalidURL(proxy) from e
626s 
626s         if timeout is sentinel:
626s             real_timeout: ClientTimeout = self._timeout
626s         else:
626s             if not isinstance(timeout, ClientTimeout):
626s                 real_timeout = ClientTimeout(total=timeout)
626s             else:
626s                 real_timeout = timeout
626s         # timeout is cumulative for all request operations
626s         # (request, redirects, responses, data consuming)
626s         tm = TimeoutHandle(
626s             self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold
626s         )
626s         handle = tm.start()
626s 
626s         if read_bufsize is None:
626s             read_bufsize = self._read_bufsize
626s 
626s         if auto_decompress is None:
626s             auto_decompress = self._auto_decompress
626s 
626s         if max_line_size is None:
626s             max_line_size = self._max_line_size
626s 
626s         if max_field_size is None:
626s             max_field_size = self._max_field_size
626s 
626s         traces = [
626s             Trace(
626s                 self,
626s                 trace_config,
626s                 trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx),
626s             )
626s             for trace_config in self._trace_configs
626s         ]
626s 
626s         for trace in traces:
626s             await trace.send_request_start(method, url.update_query(params), headers)
626s 
626s         timer = tm.timer()
626s         try:
626s             with timer:
626s                 # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests
626s                 retry_persistent_connection = (
626s                     self._retry_connection and method in IDEMPOTENT_METHODS
626s                 )
626s                 while True:
626s                     url, auth_from_url = strip_auth_from_url(url)
626s                     if not url.raw_host:
626s                         # NOTE: Bail early, otherwise, causes `InvalidURL` through
626s                         # NOTE: `self._request_class()` below.
626s                         err_exc_cls = (
626s                             InvalidUrlRedirectClientError
626s                             if redirects
626s                             else InvalidUrlClientError
626s                         )
626s                         raise err_exc_cls(url)
626s                     # If `auth` was passed for an already authenticated URL,
626s                     # disallow only if this is the initial URL; this is to avoid issues
626s                     # with sketchy redirects that are not the caller's responsibility
626s                     if not history and (auth and auth_from_url):
626s                         raise ValueError(
626s                             "Cannot combine AUTH argument with "
626s                             "credentials encoded in URL"
626s                         )
626s 
626s                     # Override the auth with the one from the URL only if we
626s                     # have no auth, or if we got an auth from a redirect URL
626s                     if auth is None or (history and auth_from_url is not None):
626s                         auth = auth_from_url
626s 
626s                     if (
626s                         auth is None
626s                         and self._default_auth
626s                         and (
626s                             not self._base_url or self._base_url_origin == url.origin()
626s                         )
626s                     ):
626s                         auth = self._default_auth
626s                     # It would be confusing if we support explicit
626s                     # Authorization header with auth argument
626s                     if (
626s                         headers is not None
626s                         and auth is not None
626s                         and hdrs.AUTHORIZATION in headers
626s                     ):
626s                         raise ValueError(
626s                             "Cannot combine AUTHORIZATION header "
626s                             "with AUTH argument or credentials "
626s                             "encoded in URL"
626s                         )
626s 
626s                     all_cookies = self._cookie_jar.filter_cookies(url)
626s 
626s                     if cookies is not None:
626s                         tmp_cookie_jar = CookieJar(
626s                             quote_cookie=self._cookie_jar.quote_cookie
626s                         )
626s                         tmp_cookie_jar.update_cookies(cookies)
626s                         req_cookies = tmp_cookie_jar.filter_cookies(url)
626s                         if req_cookies:
626s                             all_cookies.load(req_cookies)
626s 
626s                     if proxy is not None:
626s                         proxy = URL(proxy)
626s                     elif self._trust_env:
626s                         with suppress(LookupError):
626s                             proxy, proxy_auth = get_env_proxy_for_url(url)
626s 
626s                     req = self._request_class(
626s                         method,
626s                         url,
626s                         params=params,
626s                         headers=headers,
626s                         skip_auto_headers=skip_headers,
626s                         data=data,
626s                         cookies=all_cookies,
626s                         auth=auth,
626s                         version=version,
626s                         compress=compress,
626s                         chunked=chunked,
626s                         expect100=expect100,
626s                         loop=self._loop,
626s                         response_class=self._response_class,
626s                         proxy=proxy,
626s                         proxy_auth=proxy_auth,
626s                         timer=timer,
626s                         session=self,
626s                         ssl=ssl if ssl is not None else True,
626s                         server_hostname=server_hostname,
626s                         proxy_headers=proxy_headers,
626s                         traces=traces,
626s                         trust_env=self.trust_env,
626s                     )
626s 
626s                     # connection timeout
626s                     try:
626s                         conn = await self._connector.connect(
626s                             req, traces=traces, timeout=real_timeout
626s                         )
626s                     except asyncio.TimeoutError as exc:
626s >                       raise ConnectionTimeoutError(
626s                             f"Connection timeout to host {url}"
626s                         ) from exc
626s E                       aiohttp.client_exceptions.ConnectionTimeoutError: Connection timeout to host https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt
626s 
626s /usr/lib/python3/dist-packages/aiohttp/client.py:707: ConnectionTimeoutError
626s 
626s The above exception was the direct cause of the following exception:
626s 
626s     def test_github_open_lfs_file():
626s         # test opening a git-lfs tracked file
626s >       with fsspec.open(
626s             "github://cBioPortal:datahub@55cd360"
626s             "/public/acc_2019/data_gene_panel_matrix.txt",
626s             block_size=0,
626s         ) as f:
626s 
626s /tmp/autopkgtest.keP0ZR/autopkgtest_tmp/implementations_tests/test_github.py:24:
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s /usr/lib/python3/dist-packages/fsspec/core.py:105: in __enter__
626s     f = self.fs.open(self.path, mode=mode)
626s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
626s     f = self._open(
626s /usr/lib/python3/dist-packages/fsspec/implementations/github.py:261: in _open
626s     return self.http_fs.open(
626s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
626s     f = self._open(
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:366: in _open
626s     size = size or info.update(self.info(path, **kwargs)) or info["size"]
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:118: in wrapper
626s     return sync(self.loop, func, *args, **kwargs)
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:103: in sync
626s     raise return_result
626s /usr/lib/python3/dist-packages/fsspec/asyn.py:56: in _runner
626s     result[0] = await coro
626s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
626s 
626s self = 
626s url = 'https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt'
626s kwargs = {}, info = {}
626s session = 
626s policy = 'get'
626s 
626s     async def _info(self, url, **kwargs):
626s         """Get info of URL
626s 
626s         Tries to access location via HEAD, and then GET methods, but does
626s         not fetch the data.
626s 
626s         It is possible that the server does not supply any size information, in
626s         which case size will be given as None (and certain operations on the
626s         corresponding file will not work).
626s         """
626s         info = {}
626s         session = await self.set_session()
626s 
626s         for policy in ["head", "get"]:
626s             try:
626s                 info.update(
626s                     await _file_info(
626s                         self.encode_url(url),
626s                         size_policy=policy,
626s                         session=session,
626s                         **self.kwargs,
626s                         **kwargs,
626s                     )
626s                 )
626s                 if info.get("size") is not None:
626s                     break
626s             except Exception as exc:
626s                 if policy == "get":
626s                     # If get failed, then raise a FileNotFoundError
626s >                   raise FileNotFoundError(url) from exc
626s E                   FileNotFoundError: https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt
626s 
626s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:440: FileNotFoundError
626s =========================== short test summary info ============================
626s FAILED implementations_tests/test_github.py::test_github_open_large_file - Fi...
626s FAILED implementations_tests/test_github.py::test_github_open_lfs_file - File...
626s ===== 2 failed, 892 passed, 115 skipped, 3 deselected in 524.34s (0:08:44) =====
627s autopkgtest [08:52:54]: test fsspec-tests: -----------------------]
627s autopkgtest [08:52:54]: test fsspec-tests:  - - - - - - - - - - results - - - - - - - - - -
627s fsspec-tests            FAIL non-zero exit status 1
627s autopkgtest [08:52:54]: @@@@@@@@@@@@@@@@@@@@ summary
627s fsspec-tests            FAIL non-zero exit status 1