0s autopkgtest [08:21:32]: starting date and time: 2025-06-30 08:21:32+0000
0s autopkgtest [08:21:32]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [08:21:32]: host juju-7f2275-prod-proposed-migration-environment-15; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.wra3k0h3/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:fsspec --apt-upgrade fsspec --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=fsspec/2025.3.2-1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-cpu2-ram4-disk20-s390x --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-15@sto01-s390x-14.secgroup --name adt-questing-s390x-fsspec-20250630-082131-juju-7f2275-prod-proposed-migration-environment-15-c6074c03-7801-42c1-8e64-5685ab9b4625 --image adt/ubuntu-questing-s390x-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-15 --net-id=net_prod-autopkgtest-workers-s390x -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
3s Creating nova instance adt-questing-s390x-fsspec-20250630-082131-juju-7f2275-prod-proposed-migration-environment-15-c6074c03-7801-42c1-8e64-5685ab9b4625 from image adt/ubuntu-questing-s390x-server-20250629.img (UUID b1347037-8375-4d72-95d3-b07748484dde)...
67s autopkgtest [08:22:39]: testbed dpkg architecture: s390x
67s autopkgtest [08:22:39]: testbed apt version: 3.1.2
68s autopkgtest [08:22:40]: @@@@@@@@@@@@@@@@@@@@ test bed setup
68s autopkgtest [08:22:40]: testbed release detected to be: None
69s autopkgtest [08:22:41]: updating testbed package index (apt update)
69s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
69s Hit:2 http://ftpmaster.internal/ubuntu questing InRelease
69s Hit:3 http://ftpmaster.internal/ubuntu questing-updates InRelease
69s Hit:4 http://ftpmaster.internal/ubuntu questing-security InRelease
69s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [26.6 kB]
70s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [17.5 kB]
70s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [429 kB]
70s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main s390x Packages [30.9 kB]
70s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/universe s390x Packages [377 kB]
70s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/multiverse s390x Packages [5252 B]
70s Fetched 1136 kB in 1s (1027 kB/s)
75s Reading package lists...
76s autopkgtest [08:22:48]: upgrading testbed (apt dist-upgrade and autopurge)
76s Reading package lists...
77s Building dependency tree...
77s Reading state information...
77s Calculating upgrade...
77s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
78s Reading package lists...
79s Building dependency tree...
79s Reading state information...
79s Solving dependencies...
79s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
82s autopkgtest [08:22:54]: testbed running kernel: Linux 6.15.0-3-generic #3-Ubuntu SMP Wed Jun 4 07:31:50 UTC 2025
83s autopkgtest [08:22:55]: @@@@@@@@@@@@@@@@@@@@ apt-source fsspec
84s Get:1 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (dsc) [2580 B]
84s Get:2 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (tar) [432 kB]
84s Get:3 http://ftpmaster.internal/ubuntu questing-proposed/universe fsspec 2025.3.2-1 (diff) [7208 B]
85s gpgv: Signature made Fri Apr 4 17:43:51 2025 UTC
85s gpgv: using RSA key 13796755BBC72BB8ABE2AEB5FA9DEC5DE11C63F1
85s gpgv: issuer "eamanu@debian.org"
85s gpgv: Can't check signature: No public key
85s dpkg-source: warning: cannot verify inline signature for ./fsspec_2025.3.2-1.dsc: no acceptable signature found
85s autopkgtest [08:22:57]: testing package fsspec version 2025.3.2-1
85s autopkgtest [08:22:57]: build not needed
87s autopkgtest [08:22:59]: test fsspec-tests: preparing testbed
87s Reading package lists...
87s Building dependency tree...
87s Reading state information...
88s Solving dependencies...
88s The following NEW packages will be installed:
88s   fonts-font-awesome fonts-lato libblas3 libgfortran5 libjs-jquery
88s   libjs-sphinxdoc libjs-underscore liblapack3 python-fsspec-doc
88s   python3-aiohappyeyeballs python3-aiohttp python3-aiosignal python3-all
88s   python3-async-generator python3-async-timeout python3-frozenlist
88s   python3-fsspec python3-iniconfig python3-multidict python3-numpy
88s   python3-numpy-dev python3-pluggy python3-propcache python3-pytest
88s   python3-pytest-asyncio python3-pytest-mock python3-pytest-vcr python3-tqdm
88s   python3-vcr python3-wrapt python3-yarl sphinx-rtd-theme-common
88s 0 upgraded, 32 newly installed, 0 to remove and 0 not upgraded.
88s Need to get 14.9 MB of archives.
88s After this operation, 67.7 MB of additional disk space will be used.
88s Get:1 http://ftpmaster.internal/ubuntu questing/main s390x fonts-lato all 2.015-1 [2781 kB]
88s Get:2 http://ftpmaster.internal/ubuntu questing/main s390x python3-numpy-dev s390x 1:2.2.4+ds-1ubuntu1 [147 kB]
88s Get:3 http://ftpmaster.internal/ubuntu questing/main s390x libblas3 s390x 3.12.1-2build1 [252 kB]
88s Get:4 http://ftpmaster.internal/ubuntu questing/main s390x libgfortran5 s390x 15.1.0-8ubuntu1 [620 kB]
89s Get:5 http://ftpmaster.internal/ubuntu questing/main s390x liblapack3 s390x 3.12.1-2build1 [2970 kB]
89s Get:6 http://ftpmaster.internal/ubuntu questing/main s390x python3-numpy s390x 1:2.2.4+ds-1ubuntu1 [4399 kB]
89s Get:7 http://ftpmaster.internal/ubuntu questing/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB]
89s Get:8 http://ftpmaster.internal/ubuntu questing/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB]
89s Get:9 http://ftpmaster.internal/ubuntu questing/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB]
89s Get:10 http://ftpmaster.internal/ubuntu questing/main s390x libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB]
89s Get:11 http://ftpmaster.internal/ubuntu questing/main s390x sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB]
89s Get:12 http://ftpmaster.internal/ubuntu questing-proposed/universe s390x python-fsspec-doc all 2025.3.2-1 [321 kB]
89s Get:13 http://ftpmaster.internal/ubuntu questing/universe s390x python3-aiohappyeyeballs all 2.6.1-1 [11.1 kB]
90s Get:14 http://ftpmaster.internal/ubuntu questing/universe s390x python3-multidict s390x 6.4.3-1 [49.6 kB]
90s Get:15 http://ftpmaster.internal/ubuntu questing/universe s390x python3-propcache s390x 0.3.1-1 [41.1 kB]
90s Get:16 http://ftpmaster.internal/ubuntu questing/universe s390x python3-yarl s390x 1.19.0-1 [88.1 kB]
90s Get:17 http://ftpmaster.internal/ubuntu questing/universe s390x python3-async-timeout all 5.0.1-1 [6830 B]
90s Get:18 http://ftpmaster.internal/ubuntu questing/universe s390x python3-frozenlist s390x 1.6.0-1 [101 kB]
90s Get:19 http://ftpmaster.internal/ubuntu questing/universe s390x python3-aiosignal all 1.3.2-1 [5182 B]
90s Get:20 http://ftpmaster.internal/ubuntu questing/universe s390x python3-aiohttp s390x 3.11.16-1 [369 kB]
90s Get:21 http://ftpmaster.internal/ubuntu questing/main s390x python3-all s390x 3.13.4-1 [880 B]
90s Get:22 http://ftpmaster.internal/ubuntu questing/universe s390x python3-async-generator all 1.10-4 [17.5 kB]
90s Get:23 http://ftpmaster.internal/ubuntu questing-proposed/universe s390x python3-fsspec all 2025.3.2-1 [217 kB]
90s Get:24 http://ftpmaster.internal/ubuntu questing/universe s390x python3-iniconfig all 1.1.1-2 [6024 B]
90s Get:25 http://ftpmaster.internal/ubuntu questing/universe s390x python3-pluggy all 1.5.0-1 [21.0 kB]
90s Get:26 http://ftpmaster.internal/ubuntu questing/universe s390x python3-pytest all 8.3.5-2 [252 kB]
90s Get:27 http://ftpmaster.internal/ubuntu questing/universe s390x python3-pytest-asyncio all 0.25.1-1 [17.0 kB]
90s Get:28 http://ftpmaster.internal/ubuntu questing/universe s390x python3-pytest-mock all 3.14.0-2 [11.7 kB]
90s Get:29 http://ftpmaster.internal/ubuntu questing/main s390x python3-wrapt s390x 1.15.0-4build1 [34.5 kB]
90s Get:30 http://ftpmaster.internal/ubuntu questing/universe s390x python3-vcr all 7.0.0-2 [33.3 kB]
90s Get:31 http://ftpmaster.internal/ubuntu questing/universe s390x python3-pytest-vcr all 1.0.2-4 [5228 B]
90s Get:32 http://ftpmaster.internal/ubuntu questing/universe s390x python3-tqdm all 4.67.1-5 [92.1 kB]
92s Fetched 14.9 MB in 2s (8798 kB/s)
92s Selecting previously unselected package fonts-lato.
93s (Reading database ... 85958 files and directories currently installed.)
93s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ...
93s Unpacking fonts-lato (2.015-1) ...
93s Selecting previously unselected package python3-numpy-dev:s390x.
94s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_s390x.deb ...
94s Unpacking python3-numpy-dev:s390x (1:2.2.4+ds-1ubuntu1) ...
94s Selecting previously unselected package libblas3:s390x.
94s Preparing to unpack .../02-libblas3_3.12.1-2build1_s390x.deb ...
94s Unpacking libblas3:s390x (3.12.1-2build1) ...
94s Selecting previously unselected package libgfortran5:s390x.
94s Preparing to unpack .../03-libgfortran5_15.1.0-8ubuntu1_s390x.deb ...
94s Unpacking libgfortran5:s390x (15.1.0-8ubuntu1) ...
94s Selecting previously unselected package liblapack3:s390x.
94s Preparing to unpack .../04-liblapack3_3.12.1-2build1_s390x.deb ...
94s Unpacking liblapack3:s390x (3.12.1-2build1) ...
94s Selecting previously unselected package python3-numpy.
94s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_s390x.deb ...
94s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ...
95s Selecting previously unselected package fonts-font-awesome.
95s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ...
95s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
95s Selecting previously unselected package libjs-jquery.
95s Preparing to unpack .../07-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ...
95s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
95s Selecting previously unselected package libjs-underscore.
95s Preparing to unpack .../08-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ...
95s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
95s Selecting previously unselected package libjs-sphinxdoc.
95s Preparing to unpack .../09-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ...
95s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ...
95s Selecting previously unselected package sphinx-rtd-theme-common.
95s Preparing to unpack .../10-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ...
95s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
95s Selecting previously unselected package python-fsspec-doc.
95s Preparing to unpack .../11-python-fsspec-doc_2025.3.2-1_all.deb ...
95s Unpacking python-fsspec-doc (2025.3.2-1) ...
95s Selecting previously unselected package python3-aiohappyeyeballs.
95s Preparing to unpack .../12-python3-aiohappyeyeballs_2.6.1-1_all.deb ...
95s Unpacking python3-aiohappyeyeballs (2.6.1-1) ...
95s Selecting previously unselected package python3-multidict.
95s Preparing to unpack .../13-python3-multidict_6.4.3-1_s390x.deb ...
95s Unpacking python3-multidict (6.4.3-1) ...
95s Selecting previously unselected package python3-propcache.
95s Preparing to unpack .../14-python3-propcache_0.3.1-1_s390x.deb ...
95s Unpacking python3-propcache (0.3.1-1) ...
95s Selecting previously unselected package python3-yarl.
95s Preparing to unpack .../15-python3-yarl_1.19.0-1_s390x.deb ...
95s Unpacking python3-yarl (1.19.0-1) ...
95s Selecting previously unselected package python3-async-timeout.
95s Preparing to unpack .../16-python3-async-timeout_5.0.1-1_all.deb ...
95s Unpacking python3-async-timeout (5.0.1-1) ...
96s Selecting previously unselected package python3-frozenlist.
96s Preparing to unpack .../17-python3-frozenlist_1.6.0-1_s390x.deb ...
96s Unpacking python3-frozenlist (1.6.0-1) ...
96s Selecting previously unselected package python3-aiosignal.
96s Preparing to unpack .../18-python3-aiosignal_1.3.2-1_all.deb ...
96s Unpacking python3-aiosignal (1.3.2-1) ...
96s Selecting previously unselected package python3-aiohttp.
96s Preparing to unpack .../19-python3-aiohttp_3.11.16-1_s390x.deb ...
96s Unpacking python3-aiohttp (3.11.16-1) ...
96s Selecting previously unselected package python3-all.
96s Preparing to unpack .../20-python3-all_3.13.4-1_s390x.deb ...
96s Unpacking python3-all (3.13.4-1) ...
96s Selecting previously unselected package python3-async-generator.
96s Preparing to unpack .../21-python3-async-generator_1.10-4_all.deb ...
96s Unpacking python3-async-generator (1.10-4) ...
96s Selecting previously unselected package python3-fsspec.
96s Preparing to unpack .../22-python3-fsspec_2025.3.2-1_all.deb ...
96s Unpacking python3-fsspec (2025.3.2-1) ...
96s Selecting previously unselected package python3-iniconfig.
96s Preparing to unpack .../23-python3-iniconfig_1.1.1-2_all.deb ...
96s Unpacking python3-iniconfig (1.1.1-2) ...
96s Selecting previously unselected package python3-pluggy.
96s Preparing to unpack .../24-python3-pluggy_1.5.0-1_all.deb ...
96s Unpacking python3-pluggy (1.5.0-1) ...
96s Selecting previously unselected package python3-pytest.
96s Preparing to unpack .../25-python3-pytest_8.3.5-2_all.deb ...
96s Unpacking python3-pytest (8.3.5-2) ...
96s Selecting previously unselected package python3-pytest-asyncio.
96s Preparing to unpack .../26-python3-pytest-asyncio_0.25.1-1_all.deb ...
96s Unpacking python3-pytest-asyncio (0.25.1-1) ...
96s Selecting previously unselected package python3-pytest-mock.
96s Preparing to unpack .../27-python3-pytest-mock_3.14.0-2_all.deb ...
96s Unpacking python3-pytest-mock (3.14.0-2) ...
96s Selecting previously unselected package python3-wrapt.
96s Preparing to unpack .../28-python3-wrapt_1.15.0-4build1_s390x.deb ...
96s Unpacking python3-wrapt (1.15.0-4build1) ...
96s Selecting previously unselected package python3-vcr.
97s Preparing to unpack .../29-python3-vcr_7.0.0-2_all.deb ...
97s Unpacking python3-vcr (7.0.0-2) ...
97s Selecting previously unselected package python3-pytest-vcr.
97s Preparing to unpack .../30-python3-pytest-vcr_1.0.2-4_all.deb ...
97s Unpacking python3-pytest-vcr (1.0.2-4) ...
97s Selecting previously unselected package python3-tqdm.
97s Preparing to unpack .../31-python3-tqdm_4.67.1-5_all.deb ...
97s Unpacking python3-tqdm (4.67.1-5) ...
97s Setting up python3-iniconfig (1.1.1-2) ...
97s Setting up fonts-lato (2.015-1) ...
97s Setting up python3-async-generator (1.10-4) ...
98s Setting up python3-fsspec (2025.3.2-1) ...
99s Setting up python3-tqdm (4.67.1-5) ...
99s Setting up python3-all (3.13.4-1) ...
99s Setting up python3-multidict (6.4.3-1) ...
100s Setting up python3-frozenlist (1.6.0-1) ...
100s Setting up python3-aiosignal (1.3.2-1) ...
100s Setting up python3-async-timeout (5.0.1-1) ...
101s Setting up libblas3:s390x (3.12.1-2build1) ...
101s update-alternatives: using /usr/lib/s390x-linux-gnu/blas/libblas.so.3 to provide /usr/lib/s390x-linux-gnu/libblas.so.3 (libblas.so.3-s390x-linux-gnu) in auto mode
101s Setting up python3-numpy-dev:s390x (1:2.2.4+ds-1ubuntu1) ...
101s Setting up python3-wrapt (1.15.0-4build1) ...
101s Setting up python3-aiohappyeyeballs (2.6.1-1) ...
102s Setting up libgfortran5:s390x (15.1.0-8ubuntu1) ...
102s Setting up python3-pluggy (1.5.0-1) ...
102s Setting up python3-propcache (0.3.1-1) ...
102s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
102s Setting up python3-yarl (1.19.0-1) ...
102s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
102s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
102s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
102s Setting up liblapack3:s390x (3.12.1-2build1) ...
102s update-alternatives: using /usr/lib/s390x-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/s390x-linux-gnu/liblapack.so.3 (liblapack.so.3-s390x-linux-gnu) in auto mode
102s Setting up python3-pytest (8.3.5-2) ...
103s Setting up python3-aiohttp (3.11.16-1) ...
104s Setting up python3-vcr (7.0.0-2) ...
104s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ...
110s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ...
110s Setting up python3-pytest-asyncio (0.25.1-1) ...
111s Setting up python3-pytest-mock (3.14.0-2) ...
111s Setting up python3-pytest-vcr (1.0.2-4) ...
111s Setting up python-fsspec-doc (2025.3.2-1) ...
111s Processing triggers for man-db (2.13.1-1) ...
112s Processing triggers for libc-bin (2.41-6ubuntu2) ...
113s autopkgtest [08:23:25]: test fsspec-tests: [-----------------------
113s 'fsspec/tests' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests'
113s 'fsspec/tests/__init__.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/__init__.py'
113s 'fsspec/tests/abstract' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract'
113s 'fsspec/tests/abstract/__init__.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/__init__.py'
113s 'fsspec/tests/abstract/common.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/common.py'
113s 'fsspec/tests/abstract/copy.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/copy.py'
113s 'fsspec/tests/abstract/get.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/get.py'
113s 'fsspec/tests/abstract/mv.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/mv.py'
113s 'fsspec/tests/abstract/open.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/open.py'
113s 'fsspec/tests/abstract/pipe.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/pipe.py'
113s 'fsspec/tests/abstract/put.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/abstract/put.py'
113s 'fsspec/tests/data' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/data'
113s 'fsspec/tests/data/listing.html' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/data/listing.html'
113s 'fsspec/tests/test_api.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_api.py'
113s 'fsspec/tests/test_async.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_async.py'
113s 'fsspec/tests/test_caches.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_caches.py'
113s 'fsspec/tests/test_callbacks.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_callbacks.py'
113s 'fsspec/tests/test_compression.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_compression.py'
113s 'fsspec/tests/test_config.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_config.py'
113s 'fsspec/tests/test_core.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_core.py'
113s 'fsspec/tests/test_downstream.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_downstream.py'
113s 'fsspec/tests/test_file.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_file.py'
113s 'fsspec/tests/test_fuse.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_fuse.py'
113s 'fsspec/tests/test_generic.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_generic.py'
113s 'fsspec/tests/test_gui.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_gui.py'
113s 'fsspec/tests/test_mapping.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_mapping.py'
113s 'fsspec/tests/test_parquet.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_parquet.py'
113s 'fsspec/tests/test_registry.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_registry.py'
113s 'fsspec/tests/test_spec.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_spec.py'
113s 'fsspec/tests/test_utils.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/test_utils.py'
113s 'fsspec/tests/conftest.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/tests/conftest.py'
113s 'fsspec/implementations/tests' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests'
113s 'fsspec/implementations/tests/__init__.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/__init__.py'
113s 'fsspec/implementations/tests/cassettes' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes'
113s 'fsspec/implementations/tests/cassettes/test_dbfs' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_file_listing.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_file_listing.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_mkdir.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_mkdir.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_pyarrow_non_partitioned.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_pyarrow_non_partitioned.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_range.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_range.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_read_range_chunked.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_read_range_chunked.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_write_and_read.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_write_and_read.yaml'
113s 'fsspec/implementations/tests/cassettes/test_dbfs/test_dbfs_write_pyarrow_non_partitioned.yaml' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/cassettes/test_dbfs/test_dbfs_write_pyarrow_non_partitioned.yaml'
113s 'fsspec/implementations/tests/conftest.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/conftest.py'
113s 'fsspec/implementations/tests/ftp_tls.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/ftp_tls.py'
113s 'fsspec/implementations/tests/keycert.pem' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/keycert.pem'
113s 'fsspec/implementations/tests/local' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/local'
113s 'fsspec/implementations/tests/local/__init__.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/local/__init__.py'
113s 'fsspec/implementations/tests/local/local_fixtures.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/local/local_fixtures.py'
113s 'fsspec/implementations/tests/local/local_test.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/local/local_test.py'
113s 'fsspec/implementations/tests/memory' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/memory'
113s 'fsspec/implementations/tests/memory/__init__.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/memory/__init__.py'
113s 'fsspec/implementations/tests/memory/memory_fixtures.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/memory/memory_fixtures.py'
113s 'fsspec/implementations/tests/memory/memory_test.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/memory/memory_test.py'
113s 'fsspec/implementations/tests/out.zip' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/out.zip'
113s 'fsspec/implementations/tests/test_archive.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_archive.py'
113s 'fsspec/implementations/tests/test_arrow.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_arrow.py'
113s 'fsspec/implementations/tests/test_asyn_wrapper.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_asyn_wrapper.py'
113s 'fsspec/implementations/tests/test_cached.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_cached.py'
113s 'fsspec/implementations/tests/test_common.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_common.py'
113s 'fsspec/implementations/tests/test_dask.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_dask.py'
113s 'fsspec/implementations/tests/test_data.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_data.py'
113s 'fsspec/implementations/tests/test_dbfs.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_dbfs.py'
113s 'fsspec/implementations/tests/test_dirfs.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_dirfs.py'
113s 'fsspec/implementations/tests/test_ftp.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_ftp.py'
113s 'fsspec/implementations/tests/test_git.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_git.py'
113s 'fsspec/implementations/tests/test_github.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_github.py'
113s 'fsspec/implementations/tests/test_http.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_http.py'
113s 'fsspec/implementations/tests/test_http_sync.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_http_sync.py'
113s 'fsspec/implementations/tests/test_jupyter.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_jupyter.py'
113s 'fsspec/implementations/tests/test_libarchive.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_libarchive.py'
113s 'fsspec/implementations/tests/test_local.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_local.py'
113s 'fsspec/implementations/tests/test_memory.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_memory.py'
113s 'fsspec/implementations/tests/test_reference.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_reference.py'
113s 'fsspec/implementations/tests/test_sftp.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_sftp.py'
113s 'fsspec/implementations/tests/test_smb.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_smb.py'
113s 'fsspec/implementations/tests/test_tar.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_tar.py'
113s 'fsspec/implementations/tests/test_webhdfs.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_webhdfs.py'
113s 'fsspec/implementations/tests/test_zip.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_zip.py'
113s 'fsspec/conftest.py' -> '/tmp/autopkgtest.kj26U7/autopkgtest_tmp/conftest.py'
113s === python3.13 ===
114s /usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset.
114s The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future. Valid fixture loop scopes are: "function", "class", "module", "package", "session"
114s
114s   warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET))
116s ============================= test session starts ==============================
116s platform linux -- Python 3.13.5, pytest-8.3.5, pluggy-1.5.0
116s rootdir: /tmp/autopkgtest.kj26U7/autopkgtest_tmp
116s plugins: asyncio-0.25.1, typeguard-4.4.2, mock-3.14.0, vcr-1.0.2
116s asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=None
116s collected 790 items / 2 skipped
116s
116s tests/test_api.py ...............x...... [ 2%]
120s tests/test_async.py .........s... [ 4%]
121s tests/test_caches.py ................................................... [ 10%]
121s ........................................................................ [ 20%]
121s ....................... [ 22%]
121s tests/test_callbacks.py ........ [ 23%]
121s tests/test_compression.py ...sss [ 24%]
121s tests/test_config.py ....... [ 25%]
121s tests/test_core.py .................................................ss.. [ 32%]
121s sss.s [ 32%]
121s tests/test_file.py sssssssss.s [ 34%]
121s tests/test_generic.py ...... [ 35%]
121s tests/test_mapping.py ................. [ 37%]
121s tests/test_parquet.py ssssssssssssssssssssssssssssssssssssssssssssssssss [ 43%]
122s ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 52%]
122s ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 61%]
122s tests/test_registry.py ......s [ 62%]
152s tests/test_spec.py ....................x................................ [ 69%]
152s .....ssssssssss......................................................... [ 78%]
152s ........................................................................ [ 87%]
153s ................................. [ 91%]
153s tests/test_utils.py .................................................... [ 98%]
153s ............... [100%]
153s
153s =============================== warnings summary ===============================
153s tests/test_async.py::test_async_streamed_file_write
153s   /usr/lib/python3.13/functools.py:77: RuntimeWarning: coroutine 'test_run_coros_in_chunks.<locals>.runner' was never awaited
153s     return partial(update_wrapper, wrapped=wrapped,
153s   Enable tracemalloc to get traceback where the object was allocated.
153s See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.
153s
153s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
153s =========== 565 passed, 225 skipped, 2 xfailed, 1 warning in 38.30s ============
153s /usr/lib/python3/dist-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset.
153s The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future.
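The PytestDeprecationWarning above spells out its own remedy: set the loop scope explicitly in the pytest configuration. A minimal sketch of how that might look (the option name comes from the warning itself; whether the test suite actually carries a pytest.ini, a setup.cfg, or a pyproject.toml section is an assumption):

```ini
; pytest.ini — hypothetical; silences the pytest-asyncio loop-scope warning
[pytest]
asyncio_default_fixture_loop_scope = function
```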
Valid fixture loop scopes are: "function", "class", "module", "package", "session" 153s 153s warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET)) 155s ============================= test session starts ============================== 155s platform linux -- Python 3.13.5, pytest-8.3.5, pluggy-1.5.0 155s rootdir: /tmp/autopkgtest.kj26U7/autopkgtest_tmp 155s plugins: asyncio-0.25.1, typeguard-4.4.2, mock-3.14.0, vcr-1.0.2 155s asyncio: mode=Mode.STRICT, asyncio_default_fixture_loop_scope=None 155s collected 1005 items / 3 deselected / 7 skipped / 1002 selected 155s 155s implementations_tests/local/local_test.py .............................. [ 2%] 156s ........................................................................ [ 10%] 156s ................................. [ 13%] 156s implementations_tests/memory/memory_test.py ............................ [ 16%] 157s ........................................................................ [ 23%] 157s ..................................... [ 27%] 159s implementations_tests/test_archive.py .................................. [ 30%] 169s ...................................................sssssssssssssssss [ 37%] 169s implementations_tests/test_asyn_wrapper.py ......... [ 38%] 177s implementations_tests/test_cached.py ..........ssssssss......sss........ [ 41%] 177s ..........ssssssssssssssss.s........ssss..................... [ 47%] 177s implementations_tests/test_common.py ssss [ 48%] 177s implementations_tests/test_data.py .. [ 48%] 177s implementations_tests/test_dirfs.py .................................... [ 51%] 177s ........................................................................ [ 59%] 178s .......................... [ 61%] 178s implementations_tests/test_ftp.py sssssssssssssssssss [ 63%] 676s implementations_tests/test_github.py .FF.. [ 64%] 676s implementations_tests/test_http.py ..................................... [ 67%] 677s .................... 
[ 69%]
677s implementations_tests/test_http_sync.py ................................ [ 73%]
678s ....... [ 73%]
678s implementations_tests/test_libarchive.py s [ 73%]
678s implementations_tests/test_local.py .s........................s......... [ 77%]
678s ....................................................ss........ss.sssss.. [ 84%]
678s .....sss....s.......................... [ 88%]
678s implementations_tests/test_memory.py .............................. [ 91%]
679s implementations_tests/test_reference.py ..................s.....ss..ssss [ 94%]
679s s [ 94%]
679s implementations_tests/test_tar.py ......................... [ 97%]
679s implementations_tests/test_webhdfs.py ssssssssssss [ 98%]
679s implementations_tests/test_zip.py ............... [100%]
679s
679s =================================== FAILURES ===================================
679s _________________________ test_github_open_large_file __________________________
679s
679s self = , addr_infos = []
679s req = 
679s timeout = ClientTimeout(total=300, connect=None, sock_read=None, sock_connect=30, ceil_threshold=5)
679s client_error = 
679s args = (functools.partial(, loop=<_UnixSelectorEventLoop running=True closed=False debug=False>),)
679s kwargs = {'server_hostname': 'raw.githubusercontent.com', 'ssl': }
679s
679s     async def _wrap_create_connection(
679s         self,
679s         *args: Any,
679s         addr_infos: List[aiohappyeyeballs.AddrInfoType],
679s         req: ClientRequest,
679s         timeout: "ClientTimeout",
679s         client_error: Type[Exception] = ClientConnectorError,
679s         **kwargs: Any,
679s     ) -> Tuple[asyncio.Transport, ResponseHandler]:
679s         try:
679s             async with ceil_timeout(
679s                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
679s             ):
679s >               sock = await aiohappyeyeballs.start_connection(
679s                     addr_infos=addr_infos,
679s                     local_addr_infos=self._local_addr_infos,
679s                     happy_eyeballs_delay=self._happy_eyeballs_delay,
679s                     interleave=self._interleave,
679s                     loop=self._loop,
679s                 )
679s
679s
/usr/lib/python3/dist-packages/aiohttp/connector.py:1115: 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s /usr/lib/python3/dist-packages/aiohappyeyeballs/impl.py:87: in start_connection 679s sock, _, _ = await _staggered.staggered_race( 679s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:165: in staggered_race 679s done = await _wait_one( 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s 679s futures = {.run_one_coro() done, defined at /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:115> result=None>} 679s loop = <_UnixSelectorEventLoop running=True closed=False debug=False> 679s 679s async def _wait_one( 679s futures: "Iterable[asyncio.Future[Any]]", 679s loop: asyncio.AbstractEventLoop, 679s ) -> _T: 679s """Wait for the first future to complete.""" 679s wait_next = loop.create_future() 679s 679s def _on_completion(fut: "asyncio.Future[Any]") -> None: 679s if not wait_next.done(): 679s wait_next.set_result(fut) 679s 679s for f in futures: 679s f.add_done_callback(_on_completion) 679s 679s try: 679s > return await wait_next 679s E asyncio.exceptions.CancelledError 679s 679s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:46: CancelledError 679s 679s The above exception was the direct cause of the following exception: 679s 679s self = , method = 'GET' 679s str_or_url = URL('https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv') 679s 679s async def _request( 679s self, 679s method: str, 679s str_or_url: StrOrURL, 679s *, 679s params: Query = None, 679s data: Any = None, 679s json: Any = None, 679s cookies: Optional[LooseCookies] = None, 679s headers: Optional[LooseHeaders] = None, 679s skip_auto_headers: Optional[Iterable[str]] = None, 679s auth: Optional[BasicAuth] = None, 679s allow_redirects: bool = True, 679s max_redirects: int = 10, 679s compress: Union[str, bool, None] = None, 679s chunked: Optional[bool] = 
None, 679s expect100: bool = False, 679s raise_for_status: Union[ 679s None, bool, Callable[[ClientResponse], Awaitable[None]] 679s ] = None, 679s read_until_eof: bool = True, 679s proxy: Optional[StrOrURL] = None, 679s proxy_auth: Optional[BasicAuth] = None, 679s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 679s verify_ssl: Optional[bool] = None, 679s fingerprint: Optional[bytes] = None, 679s ssl_context: Optional[SSLContext] = None, 679s ssl: Union[SSLContext, bool, Fingerprint] = True, 679s server_hostname: Optional[str] = None, 679s proxy_headers: Optional[LooseHeaders] = None, 679s trace_request_ctx: Optional[Mapping[str, Any]] = None, 679s read_bufsize: Optional[int] = None, 679s auto_decompress: Optional[bool] = None, 679s max_line_size: Optional[int] = None, 679s max_field_size: Optional[int] = None, 679s ) -> ClientResponse: 679s 679s # NOTE: timeout clamps existing connect and read timeouts. We cannot 679s # set the default to None because we need to detect if the user wants 679s # to use the existing timeouts by setting timeout to None. 
679s 679s if self.closed: 679s raise RuntimeError("Session is closed") 679s 679s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 679s 679s if data is not None and json is not None: 679s raise ValueError( 679s "data and json parameters can not be used at the same time" 679s ) 679s elif json is not None: 679s data = payload.JsonPayload(json, dumps=self._json_serialize) 679s 679s if not isinstance(chunked, bool) and chunked is not None: 679s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 679s 679s redirects = 0 679s history: List[ClientResponse] = [] 679s version = self._version 679s params = params or {} 679s 679s # Merge with default headers and transform to CIMultiDict 679s headers = self._prepare_headers(headers) 679s 679s try: 679s url = self._build_url(str_or_url) 679s except ValueError as e: 679s raise InvalidUrlClientError(str_or_url) from e 679s 679s assert self._connector is not None 679s if url.scheme not in self._connector.allowed_protocol_schema_set: 679s raise NonHttpUrlClientError(url) 679s 679s skip_headers: Optional[Iterable[istr]] 679s if skip_auto_headers is not None: 679s skip_headers = { 679s istr(i) for i in skip_auto_headers 679s } | self._skip_auto_headers 679s elif self._skip_auto_headers: 679s skip_headers = self._skip_auto_headers 679s else: 679s skip_headers = None 679s 679s if proxy is None: 679s proxy = self._default_proxy 679s if proxy_auth is None: 679s proxy_auth = self._default_proxy_auth 679s 679s if proxy is None: 679s proxy_headers = None 679s else: 679s proxy_headers = self._prepare_headers(proxy_headers) 679s try: 679s proxy = URL(proxy) 679s except ValueError as e: 679s raise InvalidURL(proxy) from e 679s 679s if timeout is sentinel: 679s real_timeout: ClientTimeout = self._timeout 679s else: 679s if not isinstance(timeout, ClientTimeout): 679s real_timeout = ClientTimeout(total=timeout) 679s else: 679s real_timeout = timeout 679s # timeout is cumulative for all request operations 679s # 
(request, redirects, responses, data consuming) 679s tm = TimeoutHandle( 679s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 679s ) 679s handle = tm.start() 679s 679s if read_bufsize is None: 679s read_bufsize = self._read_bufsize 679s 679s if auto_decompress is None: 679s auto_decompress = self._auto_decompress 679s 679s if max_line_size is None: 679s max_line_size = self._max_line_size 679s 679s if max_field_size is None: 679s max_field_size = self._max_field_size 679s 679s traces = [ 679s Trace( 679s self, 679s trace_config, 679s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 679s ) 679s for trace_config in self._trace_configs 679s ] 679s 679s for trace in traces: 679s await trace.send_request_start(method, url.update_query(params), headers) 679s 679s timer = tm.timer() 679s try: 679s with timer: 679s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 679s retry_persistent_connection = ( 679s self._retry_connection and method in IDEMPOTENT_METHODS 679s ) 679s while True: 679s url, auth_from_url = strip_auth_from_url(url) 679s if not url.raw_host: 679s # NOTE: Bail early, otherwise, causes `InvalidURL` through 679s # NOTE: `self._request_class()` below. 
679s err_exc_cls = ( 679s InvalidUrlRedirectClientError 679s if redirects 679s else InvalidUrlClientError 679s ) 679s raise err_exc_cls(url) 679s # If `auth` was passed for an already authenticated URL, 679s # disallow only if this is the initial URL; this is to avoid issues 679s # with sketchy redirects that are not the caller's responsibility 679s if not history and (auth and auth_from_url): 679s raise ValueError( 679s "Cannot combine AUTH argument with " 679s "credentials encoded in URL" 679s ) 679s 679s # Override the auth with the one from the URL only if we 679s # have no auth, or if we got an auth from a redirect URL 679s if auth is None or (history and auth_from_url is not None): 679s auth = auth_from_url 679s 679s if ( 679s auth is None 679s and self._default_auth 679s and ( 679s not self._base_url or self._base_url_origin == url.origin() 679s ) 679s ): 679s auth = self._default_auth 679s # It would be confusing if we support explicit 679s # Authorization header with auth argument 679s if ( 679s headers is not None 679s and auth is not None 679s and hdrs.AUTHORIZATION in headers 679s ): 679s raise ValueError( 679s "Cannot combine AUTHORIZATION header " 679s "with AUTH argument or credentials " 679s "encoded in URL" 679s ) 679s 679s all_cookies = self._cookie_jar.filter_cookies(url) 679s 679s if cookies is not None: 679s tmp_cookie_jar = CookieJar( 679s quote_cookie=self._cookie_jar.quote_cookie 679s ) 679s tmp_cookie_jar.update_cookies(cookies) 679s req_cookies = tmp_cookie_jar.filter_cookies(url) 679s if req_cookies: 679s all_cookies.load(req_cookies) 679s 679s if proxy is not None: 679s proxy = URL(proxy) 679s elif self._trust_env: 679s with suppress(LookupError): 679s proxy, proxy_auth = get_env_proxy_for_url(url) 679s 679s req = self._request_class( 679s method, 679s url, 679s params=params, 679s headers=headers, 679s skip_auto_headers=skip_headers, 679s data=data, 679s cookies=all_cookies, 679s auth=auth, 679s version=version, 679s compress=compress, 
679s chunked=chunked, 679s expect100=expect100, 679s loop=self._loop, 679s response_class=self._response_class, 679s proxy=proxy, 679s proxy_auth=proxy_auth, 679s timer=timer, 679s session=self, 679s ssl=ssl if ssl is not None else True, 679s server_hostname=server_hostname, 679s proxy_headers=proxy_headers, 679s traces=traces, 679s trust_env=self.trust_env, 679s ) 679s 679s # connection timeout 679s try: 679s > conn = await self._connector.connect( 679s req, traces=traces, timeout=real_timeout 679s ) 679s 679s /usr/lib/python3/dist-packages/aiohttp/client.py:703: 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s /usr/lib/python3/dist-packages/aiohttp/connector.py:548: in connect 679s proto = await self._create_connection(req, traces, timeout) 679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1056: in _create_connection 679s _, proto = await self._create_direct_connection(req, traces, timeout) 679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1400: in _create_direct_connection 679s raise last_exc 679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1369: in _create_direct_connection 679s transp, proto = await self._wrap_create_connection( 679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1112: in _wrap_create_connection 679s async with ceil_timeout( 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s 679s self = 679s exc_type = 679s exc_val = CancelledError(), exc_tb = 679s 679s async def __aexit__( 679s self, 679s exc_type: Optional[Type[BaseException]], 679s exc_val: Optional[BaseException], 679s exc_tb: Optional[TracebackType], 679s ) -> Optional[bool]: 679s assert self._state in (_State.ENTERED, _State.EXPIRING) 679s 679s if self._timeout_handler is not None: 679s self._timeout_handler.cancel() 679s self._timeout_handler = None 679s 679s if self._state is _State.EXPIRING: 679s self._state = _State.EXPIRED 679s 679s if self._task.uncancel() <= self._cancelling 
and exc_type is not None: 679s # Since there are no new cancel requests, we're 679s # handling this. 679s if issubclass(exc_type, exceptions.CancelledError): 679s > raise TimeoutError from exc_val 679s E TimeoutError 679s 679s /usr/lib/python3.13/asyncio/timeouts.py:116: TimeoutError 679s 679s The above exception was the direct cause of the following exception: 679s 679s self = 679s url = 'https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv' 679s kwargs = {}, info = {} 679s session = , policy = 'get' 679s 679s async def _info(self, url, **kwargs): 679s """Get info of URL 679s 679s Tries to access location via HEAD, and then GET methods, but does 679s not fetch the data. 679s 679s It is possible that the server does not supply any size information, in 679s which case size will be given as None (and certain operations on the 679s corresponding file will not work). 679s """ 679s info = {} 679s session = await self.set_session() 679s 679s for policy in ["head", "get"]: 679s try: 679s info.update( 679s > await _file_info( 679s self.encode_url(url), 679s size_policy=policy, 679s session=session, 679s **self.kwargs, 679s **kwargs, 679s ) 679s ) 679s 679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:427: 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:839: in _file_info 679s r = await session.get(url, allow_redirects=ar, **kwargs) 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s 679s self = , method = 'GET' 679s str_or_url = URL('https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv') 679s 679s async def _request( 679s self, 679s method: str, 679s str_or_url: StrOrURL, 679s *, 679s params: Query = None, 679s data: Any = None, 679s json: Any = None, 679s cookies: Optional[LooseCookies] = None, 679s headers: Optional[LooseHeaders] = None, 679s 
skip_auto_headers: Optional[Iterable[str]] = None, 679s auth: Optional[BasicAuth] = None, 679s allow_redirects: bool = True, 679s max_redirects: int = 10, 679s compress: Union[str, bool, None] = None, 679s chunked: Optional[bool] = None, 679s expect100: bool = False, 679s raise_for_status: Union[ 679s None, bool, Callable[[ClientResponse], Awaitable[None]] 679s ] = None, 679s read_until_eof: bool = True, 679s proxy: Optional[StrOrURL] = None, 679s proxy_auth: Optional[BasicAuth] = None, 679s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 679s verify_ssl: Optional[bool] = None, 679s fingerprint: Optional[bytes] = None, 679s ssl_context: Optional[SSLContext] = None, 679s ssl: Union[SSLContext, bool, Fingerprint] = True, 679s server_hostname: Optional[str] = None, 679s proxy_headers: Optional[LooseHeaders] = None, 679s trace_request_ctx: Optional[Mapping[str, Any]] = None, 679s read_bufsize: Optional[int] = None, 679s auto_decompress: Optional[bool] = None, 679s max_line_size: Optional[int] = None, 679s max_field_size: Optional[int] = None, 679s ) -> ClientResponse: 679s 679s # NOTE: timeout clamps existing connect and read timeouts. We cannot 679s # set the default to None because we need to detect if the user wants 679s # to use the existing timeouts by setting timeout to None. 
679s 679s if self.closed: 679s raise RuntimeError("Session is closed") 679s 679s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 679s 679s if data is not None and json is not None: 679s raise ValueError( 679s "data and json parameters can not be used at the same time" 679s ) 679s elif json is not None: 679s data = payload.JsonPayload(json, dumps=self._json_serialize) 679s 679s if not isinstance(chunked, bool) and chunked is not None: 679s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 679s 679s redirects = 0 679s history: List[ClientResponse] = [] 679s version = self._version 679s params = params or {} 679s 679s # Merge with default headers and transform to CIMultiDict 679s headers = self._prepare_headers(headers) 679s 679s try: 679s url = self._build_url(str_or_url) 679s except ValueError as e: 679s raise InvalidUrlClientError(str_or_url) from e 679s 679s assert self._connector is not None 679s if url.scheme not in self._connector.allowed_protocol_schema_set: 679s raise NonHttpUrlClientError(url) 679s 679s skip_headers: Optional[Iterable[istr]] 679s if skip_auto_headers is not None: 679s skip_headers = { 679s istr(i) for i in skip_auto_headers 679s } | self._skip_auto_headers 679s elif self._skip_auto_headers: 679s skip_headers = self._skip_auto_headers 679s else: 679s skip_headers = None 679s 679s if proxy is None: 679s proxy = self._default_proxy 679s if proxy_auth is None: 679s proxy_auth = self._default_proxy_auth 679s 679s if proxy is None: 679s proxy_headers = None 679s else: 679s proxy_headers = self._prepare_headers(proxy_headers) 679s try: 679s proxy = URL(proxy) 679s except ValueError as e: 679s raise InvalidURL(proxy) from e 679s 679s if timeout is sentinel: 679s real_timeout: ClientTimeout = self._timeout 679s else: 679s if not isinstance(timeout, ClientTimeout): 679s real_timeout = ClientTimeout(total=timeout) 679s else: 679s real_timeout = timeout 679s # timeout is cumulative for all request operations 679s # 
(request, redirects, responses, data consuming) 679s tm = TimeoutHandle( 679s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 679s ) 679s handle = tm.start() 679s 679s if read_bufsize is None: 679s read_bufsize = self._read_bufsize 679s 679s if auto_decompress is None: 679s auto_decompress = self._auto_decompress 679s 679s if max_line_size is None: 679s max_line_size = self._max_line_size 679s 679s if max_field_size is None: 679s max_field_size = self._max_field_size 679s 679s traces = [ 679s Trace( 679s self, 679s trace_config, 679s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 679s ) 679s for trace_config in self._trace_configs 679s ] 679s 679s for trace in traces: 679s await trace.send_request_start(method, url.update_query(params), headers) 679s 679s timer = tm.timer() 679s try: 679s with timer: 679s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 679s retry_persistent_connection = ( 679s self._retry_connection and method in IDEMPOTENT_METHODS 679s ) 679s while True: 679s url, auth_from_url = strip_auth_from_url(url) 679s if not url.raw_host: 679s # NOTE: Bail early, otherwise, causes `InvalidURL` through 679s # NOTE: `self._request_class()` below. 
679s err_exc_cls = ( 679s InvalidUrlRedirectClientError 679s if redirects 679s else InvalidUrlClientError 679s ) 679s raise err_exc_cls(url) 679s # If `auth` was passed for an already authenticated URL, 679s # disallow only if this is the initial URL; this is to avoid issues 679s # with sketchy redirects that are not the caller's responsibility 679s if not history and (auth and auth_from_url): 679s raise ValueError( 679s "Cannot combine AUTH argument with " 679s "credentials encoded in URL" 679s ) 679s 679s # Override the auth with the one from the URL only if we 679s # have no auth, or if we got an auth from a redirect URL 679s if auth is None or (history and auth_from_url is not None): 679s auth = auth_from_url 679s 679s if ( 679s auth is None 679s and self._default_auth 679s and ( 679s not self._base_url or self._base_url_origin == url.origin() 679s ) 679s ): 679s auth = self._default_auth 679s # It would be confusing if we support explicit 679s # Authorization header with auth argument 679s if ( 679s headers is not None 679s and auth is not None 679s and hdrs.AUTHORIZATION in headers 679s ): 679s raise ValueError( 679s "Cannot combine AUTHORIZATION header " 679s "with AUTH argument or credentials " 679s "encoded in URL" 679s ) 679s 679s all_cookies = self._cookie_jar.filter_cookies(url) 679s 679s if cookies is not None: 679s tmp_cookie_jar = CookieJar( 679s quote_cookie=self._cookie_jar.quote_cookie 679s ) 679s tmp_cookie_jar.update_cookies(cookies) 679s req_cookies = tmp_cookie_jar.filter_cookies(url) 679s if req_cookies: 679s all_cookies.load(req_cookies) 679s 679s if proxy is not None: 679s proxy = URL(proxy) 679s elif self._trust_env: 679s with suppress(LookupError): 679s proxy, proxy_auth = get_env_proxy_for_url(url) 679s 679s req = self._request_class( 679s method, 679s url, 679s params=params, 679s headers=headers, 679s skip_auto_headers=skip_headers, 679s data=data, 679s cookies=all_cookies, 679s auth=auth, 679s version=version, 679s compress=compress, 
679s                         chunked=chunked,
679s                         expect100=expect100,
679s                         loop=self._loop,
679s                         response_class=self._response_class,
679s                         proxy=proxy,
679s                         proxy_auth=proxy_auth,
679s                         timer=timer,
679s                         session=self,
679s                         ssl=ssl if ssl is not None else True,
679s                         server_hostname=server_hostname,
679s                         proxy_headers=proxy_headers,
679s                         traces=traces,
679s                         trust_env=self.trust_env,
679s                     )
679s
679s                     # connection timeout
679s                     try:
679s                         conn = await self._connector.connect(
679s                             req, traces=traces, timeout=real_timeout
679s                         )
679s                     except asyncio.TimeoutError as exc:
679s >                       raise ConnectionTimeoutError(
679s                             f"Connection timeout to host {url}"
679s                         ) from exc
679s E                       aiohttp.client_exceptions.ConnectionTimeoutError: Connection timeout to host https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv
679s
679s /usr/lib/python3/dist-packages/aiohttp/client.py:707: ConnectionTimeoutError
679s
679s The above exception was the direct cause of the following exception:
679s
679s     def test_github_open_large_file():
679s         # test opening a large file >1 MB
679s         # use block_size=0 to get a streaming interface to the file, ensuring that
679s         # we fetch only the parts we need instead of downloading the full file all
679s         # at once
679s >       with fsspec.open(
679s             "github://mwaskom:seaborn-data@83bfba7/brain_networks.csv", block_size=0
679s         ) as f:
679s
679s /tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_github.py:15:
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s /usr/lib/python3/dist-packages/fsspec/core.py:105: in __enter__
679s     f = self.fs.open(self.path, mode=mode)
679s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
679s     f = self._open(
679s /usr/lib/python3/dist-packages/fsspec/implementations/github.py:261: in _open
679s     return self.http_fs.open(
679s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
679s     f = self._open(
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:366: in _open
679s     size = size or info.update(self.info(path, **kwargs)) or info["size"]
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:118: in wrapper
679s     return sync(self.loop, func, *args, **kwargs)
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:103: in sync
679s     raise return_result
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:56: in _runner
679s     result[0] = await coro
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s
679s self = 
679s url = 'https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv'
679s kwargs = {}, info = {}
679s session = , policy = 'get'
679s
679s     async def _info(self, url, **kwargs):
679s         """Get info of URL
679s
679s         Tries to access location via HEAD, and then GET methods, but does
679s         not fetch the data.
679s
679s         It is possible that the server does not supply any size information, in
679s         which case size will be given as None (and certain operations on the
679s         corresponding file will not work).
679s         """
679s         info = {}
679s         session = await self.set_session()
679s
679s         for policy in ["head", "get"]:
679s             try:
679s                 info.update(
679s                     await _file_info(
679s                         self.encode_url(url),
679s                         size_policy=policy,
679s                         session=session,
679s                         **self.kwargs,
679s                         **kwargs,
679s                     )
679s                 )
679s                 if info.get("size") is not None:
679s                     break
679s             except Exception as exc:
679s                 if policy == "get":
679s                     # If get failed, then raise a FileNotFoundError
679s >                   raise FileNotFoundError(url) from exc
679s E                   FileNotFoundError: https://raw.githubusercontent.com/mwaskom/seaborn-data/83bfba7/brain_networks.csv
679s
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:440: FileNotFoundError
679s __________________________ test_github_open_lfs_file ___________________________
679s
679s self = , addr_infos = []
679s req = 
679s timeout = ClientTimeout(total=300, connect=None, sock_read=None, sock_connect=30, ceil_threshold=5)
679s client_error = 
679s args = (functools.partial(, loop=<_UnixSelectorEventLoop running=True closed=False debug=False>),)
679s kwargs = {'server_hostname': 'media.githubusercontent.com', 'ssl': }
679s
679s     async def _wrap_create_connection(
679s         self,
679s         *args: Any,
679s         addr_infos: List[aiohappyeyeballs.AddrInfoType],
679s         req: ClientRequest,
679s         timeout: "ClientTimeout",
679s         client_error: Type[Exception] = ClientConnectorError,
679s         **kwargs: Any,
679s     ) -> Tuple[asyncio.Transport, ResponseHandler]:
679s         try:
679s             async with ceil_timeout(
679s                 timeout.sock_connect, ceil_threshold=timeout.ceil_threshold
679s             ):
679s >               sock = await aiohappyeyeballs.start_connection(
679s                     addr_infos=addr_infos,
679s                     local_addr_infos=self._local_addr_infos,
679s                     happy_eyeballs_delay=self._happy_eyeballs_delay,
679s                     interleave=self._interleave,
679s                     loop=self._loop,
679s                 )
679s
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1115:
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s
/usr/lib/python3/dist-packages/aiohappyeyeballs/impl.py:87: in start_connection 679s sock, _, _ = await _staggered.staggered_race( 679s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:165: in staggered_race 679s done = await _wait_one( 679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 679s 679s futures = {.run_one_coro() done, defined at /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:115> result=None>} 679s loop = <_UnixSelectorEventLoop running=True closed=False debug=False> 679s 679s async def _wait_one( 679s futures: "Iterable[asyncio.Future[Any]]", 679s loop: asyncio.AbstractEventLoop, 679s ) -> _T: 679s """Wait for the first future to complete.""" 679s wait_next = loop.create_future() 679s 679s def _on_completion(fut: "asyncio.Future[Any]") -> None: 679s if not wait_next.done(): 679s wait_next.set_result(fut) 679s 679s for f in futures: 679s f.add_done_callback(_on_completion) 679s 679s try: 679s > return await wait_next 679s E asyncio.exceptions.CancelledError 679s 679s /usr/lib/python3/dist-packages/aiohappyeyeballs/_staggered.py:46: CancelledError 679s 679s The above exception was the direct cause of the following exception: 679s 679s self = , method = 'GET' 679s str_or_url = URL('https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt') 679s 679s async def _request( 679s self, 679s method: str, 679s str_or_url: StrOrURL, 679s *, 679s params: Query = None, 679s data: Any = None, 679s json: Any = None, 679s cookies: Optional[LooseCookies] = None, 679s headers: Optional[LooseHeaders] = None, 679s skip_auto_headers: Optional[Iterable[str]] = None, 679s auth: Optional[BasicAuth] = None, 679s allow_redirects: bool = True, 679s max_redirects: int = 10, 679s compress: Union[str, bool, None] = None, 679s chunked: Optional[bool] = None, 679s expect100: bool = False, 679s raise_for_status: Union[ 679s None, bool, Callable[[ClientResponse], 
Awaitable[None]] 679s ] = None, 679s read_until_eof: bool = True, 679s proxy: Optional[StrOrURL] = None, 679s proxy_auth: Optional[BasicAuth] = None, 679s timeout: Union[ClientTimeout, _SENTINEL] = sentinel, 679s verify_ssl: Optional[bool] = None, 679s fingerprint: Optional[bytes] = None, 679s ssl_context: Optional[SSLContext] = None, 679s ssl: Union[SSLContext, bool, Fingerprint] = True, 679s server_hostname: Optional[str] = None, 679s proxy_headers: Optional[LooseHeaders] = None, 679s trace_request_ctx: Optional[Mapping[str, Any]] = None, 679s read_bufsize: Optional[int] = None, 679s auto_decompress: Optional[bool] = None, 679s max_line_size: Optional[int] = None, 679s max_field_size: Optional[int] = None, 679s ) -> ClientResponse: 679s 679s # NOTE: timeout clamps existing connect and read timeouts. We cannot 679s # set the default to None because we need to detect if the user wants 679s # to use the existing timeouts by setting timeout to None. 679s 679s if self.closed: 679s raise RuntimeError("Session is closed") 679s 679s ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) 679s 679s if data is not None and json is not None: 679s raise ValueError( 679s "data and json parameters can not be used at the same time" 679s ) 679s elif json is not None: 679s data = payload.JsonPayload(json, dumps=self._json_serialize) 679s 679s if not isinstance(chunked, bool) and chunked is not None: 679s warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) 679s 679s redirects = 0 679s history: List[ClientResponse] = [] 679s version = self._version 679s params = params or {} 679s 679s # Merge with default headers and transform to CIMultiDict 679s headers = self._prepare_headers(headers) 679s 679s try: 679s url = self._build_url(str_or_url) 679s except ValueError as e: 679s raise InvalidUrlClientError(str_or_url) from e 679s 679s assert self._connector is not None 679s if url.scheme not in self._connector.allowed_protocol_schema_set: 679s raise 
NonHttpUrlClientError(url) 679s 679s skip_headers: Optional[Iterable[istr]] 679s if skip_auto_headers is not None: 679s skip_headers = { 679s istr(i) for i in skip_auto_headers 679s } | self._skip_auto_headers 679s elif self._skip_auto_headers: 679s skip_headers = self._skip_auto_headers 679s else: 679s skip_headers = None 679s 679s if proxy is None: 679s proxy = self._default_proxy 679s if proxy_auth is None: 679s proxy_auth = self._default_proxy_auth 679s 679s if proxy is None: 679s proxy_headers = None 679s else: 679s proxy_headers = self._prepare_headers(proxy_headers) 679s try: 679s proxy = URL(proxy) 679s except ValueError as e: 679s raise InvalidURL(proxy) from e 679s 679s if timeout is sentinel: 679s real_timeout: ClientTimeout = self._timeout 679s else: 679s if not isinstance(timeout, ClientTimeout): 679s real_timeout = ClientTimeout(total=timeout) 679s else: 679s real_timeout = timeout 679s # timeout is cumulative for all request operations 679s # (request, redirects, responses, data consuming) 679s tm = TimeoutHandle( 679s self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold 679s ) 679s handle = tm.start() 679s 679s if read_bufsize is None: 679s read_bufsize = self._read_bufsize 679s 679s if auto_decompress is None: 679s auto_decompress = self._auto_decompress 679s 679s if max_line_size is None: 679s max_line_size = self._max_line_size 679s 679s if max_field_size is None: 679s max_field_size = self._max_field_size 679s 679s traces = [ 679s Trace( 679s self, 679s trace_config, 679s trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), 679s ) 679s for trace_config in self._trace_configs 679s ] 679s 679s for trace in traces: 679s await trace.send_request_start(method, url.update_query(params), headers) 679s 679s timer = tm.timer() 679s try: 679s with timer: 679s # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests 679s retry_persistent_connection = ( 679s self._retry_connection and method in 
IDEMPOTENT_METHODS
679s                 )
679s                 while True:
679s                     url, auth_from_url = strip_auth_from_url(url)
679s                     if not url.raw_host:
679s                         # NOTE: Bail early, otherwise, causes `InvalidURL` through
679s                         # NOTE: `self._request_class()` below.
679s                         err_exc_cls = (
679s                             InvalidUrlRedirectClientError
679s                             if redirects
679s                             else InvalidUrlClientError
679s                         )
679s                         raise err_exc_cls(url)
679s                     # If `auth` was passed for an already authenticated URL,
679s                     # disallow only if this is the initial URL; this is to avoid issues
679s                     # with sketchy redirects that are not the caller's responsibility
679s                     if not history and (auth and auth_from_url):
679s                         raise ValueError(
679s                             "Cannot combine AUTH argument with "
679s                             "credentials encoded in URL"
679s                         )
679s
679s                     # Override the auth with the one from the URL only if we
679s                     # have no auth, or if we got an auth from a redirect URL
679s                     if auth is None or (history and auth_from_url is not None):
679s                         auth = auth_from_url
679s
679s                     if (
679s                         auth is None
679s                         and self._default_auth
679s                         and (
679s                             not self._base_url or self._base_url_origin == url.origin()
679s                         )
679s                     ):
679s                         auth = self._default_auth
679s                     # It would be confusing if we support explicit
679s                     # Authorization header with auth argument
679s                     if (
679s                         headers is not None
679s                         and auth is not None
679s                         and hdrs.AUTHORIZATION in headers
679s                     ):
679s                         raise ValueError(
679s                             "Cannot combine AUTHORIZATION header "
679s                             "with AUTH argument or credentials "
679s                             "encoded in URL"
679s                         )
679s
679s                     all_cookies = self._cookie_jar.filter_cookies(url)
679s
679s                     if cookies is not None:
679s                         tmp_cookie_jar = CookieJar(
679s                             quote_cookie=self._cookie_jar.quote_cookie
679s                         )
679s                         tmp_cookie_jar.update_cookies(cookies)
679s                         req_cookies = tmp_cookie_jar.filter_cookies(url)
679s                         if req_cookies:
679s                             all_cookies.load(req_cookies)
679s
679s                     if proxy is not None:
679s                         proxy = URL(proxy)
679s                     elif self._trust_env:
679s                         with suppress(LookupError):
679s                             proxy, proxy_auth = get_env_proxy_for_url(url)
679s
679s                     req = self._request_class(
679s                         method,
679s                         url,
679s                         params=params,
679s                         headers=headers,
679s                         skip_auto_headers=skip_headers,
679s                         data=data,
679s                         cookies=all_cookies,
679s                         auth=auth,
679s                         version=version,
679s                         compress=compress,
679s                         chunked=chunked,
679s                         expect100=expect100,
679s                         loop=self._loop,
679s                         response_class=self._response_class,
679s                         proxy=proxy,
679s                         proxy_auth=proxy_auth,
679s                         timer=timer,
679s                         session=self,
679s                         ssl=ssl if ssl is not None else True,
679s                         server_hostname=server_hostname,
679s                         proxy_headers=proxy_headers,
679s                         traces=traces,
679s                         trust_env=self.trust_env,
679s                     )
679s
679s                     # connection timeout
679s                     try:
679s >                       conn = await self._connector.connect(
679s                             req, traces=traces, timeout=real_timeout
679s                         )
679s
679s /usr/lib/python3/dist-packages/aiohttp/client.py:703:
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:548: in connect
679s     proto = await self._create_connection(req, traces, timeout)
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1056: in _create_connection
679s     _, proto = await self._create_direct_connection(req, traces, timeout)
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1400: in _create_direct_connection
679s     raise last_exc
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1369: in _create_direct_connection
679s     transp, proto = await self._wrap_create_connection(
679s /usr/lib/python3/dist-packages/aiohttp/connector.py:1112: in _wrap_create_connection
679s     async with ceil_timeout(
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s
679s self =
679s exc_type =
679s exc_val = CancelledError(), exc_tb =
679s
679s     async def __aexit__(
679s         self,
679s         exc_type: Optional[Type[BaseException]],
679s         exc_val: Optional[BaseException],
679s         exc_tb: Optional[TracebackType],
679s     ) -> Optional[bool]:
679s         assert self._state in (_State.ENTERED, _State.EXPIRING)
679s
679s         if self._timeout_handler is not None:
679s             self._timeout_handler.cancel()
679s             self._timeout_handler = None
679s
679s         if self._state is _State.EXPIRING:
679s             self._state = _State.EXPIRED
679s
679s         if self._task.uncancel() <= self._cancelling and exc_type is not None:
679s             # Since there are no new cancel requests, we're
679s             # handling this.
679s             if issubclass(exc_type, exceptions.CancelledError):
679s >               raise TimeoutError from exc_val
679s E               TimeoutError
679s
679s /usr/lib/python3.13/asyncio/timeouts.py:116: TimeoutError
679s
679s The above exception was the direct cause of the following exception:
679s
679s self =
679s url = 'https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt'
679s kwargs = {}, info = {}
679s session = , policy = 'get'
679s
679s     async def _info(self, url, **kwargs):
679s         """Get info of URL
679s
679s         Tries to access location via HEAD, and then GET methods, but does
679s         not fetch the data.
679s
679s         It is possible that the server does not supply any size information, in
679s         which case size will be given as None (and certain operations on the
679s         corresponding file will not work).
679s         """
679s         info = {}
679s         session = await self.set_session()
679s
679s         for policy in ["head", "get"]:
679s             try:
679s                 info.update(
679s >                   await _file_info(
679s                         self.encode_url(url),
679s                         size_policy=policy,
679s                         session=session,
679s                         **self.kwargs,
679s                         **kwargs,
679s                     )
679s                 )
679s
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:427:
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:839: in _file_info
679s     r = await session.get(url, allow_redirects=ar, **kwargs)
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s
679s self = , method = 'GET'
679s str_or_url = URL('https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt')
679s
679s     async def _request(
679s         self,
679s         method: str,
679s         str_or_url: StrOrURL,
679s         *,
679s         params: Query = None,
679s         data: Any = None,
679s         json: Any = None,
679s         cookies: Optional[LooseCookies] = None,
679s         headers: Optional[LooseHeaders] = None,
679s         skip_auto_headers: Optional[Iterable[str]] = None,
679s         auth: Optional[BasicAuth] = None,
679s         allow_redirects: bool = True,
679s         max_redirects: int = 10,
679s         compress: Union[str, bool, None] = None,
679s         chunked: Optional[bool] = None,
679s         expect100: bool = False,
679s         raise_for_status: Union[
679s             None, bool, Callable[[ClientResponse], Awaitable[None]]
679s         ] = None,
679s         read_until_eof: bool = True,
679s         proxy: Optional[StrOrURL] = None,
679s         proxy_auth: Optional[BasicAuth] = None,
679s         timeout: Union[ClientTimeout, _SENTINEL] = sentinel,
679s         verify_ssl: Optional[bool] = None,
679s         fingerprint: Optional[bytes] = None,
679s         ssl_context: Optional[SSLContext] = None,
679s         ssl: Union[SSLContext, bool, Fingerprint] = True,
679s         server_hostname: Optional[str] = None,
679s         proxy_headers: Optional[LooseHeaders] = None,
679s         trace_request_ctx: Optional[Mapping[str, Any]] = None,
679s         read_bufsize: Optional[int] = None,
679s         auto_decompress: Optional[bool] = None,
679s         max_line_size: Optional[int] = None,
679s         max_field_size: Optional[int] = None,
679s     ) -> ClientResponse:
679s
679s         # NOTE: timeout clamps existing connect and read timeouts. We cannot
679s         # set the default to None because we need to detect if the user wants
679s         # to use the existing timeouts by setting timeout to None.
679s
679s         if self.closed:
679s             raise RuntimeError("Session is closed")
679s
679s         ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint)
679s
679s         if data is not None and json is not None:
679s             raise ValueError(
679s                 "data and json parameters can not be used at the same time"
679s             )
679s         elif json is not None:
679s             data = payload.JsonPayload(json, dumps=self._json_serialize)
679s
679s         if not isinstance(chunked, bool) and chunked is not None:
679s             warnings.warn("Chunk size is deprecated #1615", DeprecationWarning)
679s
679s         redirects = 0
679s         history: List[ClientResponse] = []
679s         version = self._version
679s         params = params or {}
679s
679s         # Merge with default headers and transform to CIMultiDict
679s         headers = self._prepare_headers(headers)
679s
679s         try:
679s             url = self._build_url(str_or_url)
679s         except ValueError as e:
679s             raise InvalidUrlClientError(str_or_url) from e
679s
679s         assert self._connector is not None
679s         if url.scheme not in self._connector.allowed_protocol_schema_set:
679s             raise NonHttpUrlClientError(url)
679s
679s         skip_headers: Optional[Iterable[istr]]
679s         if skip_auto_headers is not None:
679s             skip_headers = {
679s                 istr(i) for i in skip_auto_headers
679s             } | self._skip_auto_headers
679s         elif self._skip_auto_headers:
679s             skip_headers = self._skip_auto_headers
679s         else:
679s             skip_headers = None
679s
679s         if proxy is None:
679s             proxy = self._default_proxy
679s         if proxy_auth is None:
679s             proxy_auth = self._default_proxy_auth
679s
679s         if proxy is None:
679s             proxy_headers = None
679s         else:
679s             proxy_headers = self._prepare_headers(proxy_headers)
679s             try:
679s                 proxy = URL(proxy)
679s             except ValueError as e:
679s                 raise InvalidURL(proxy) from e
679s
679s         if timeout is sentinel:
679s             real_timeout: ClientTimeout = self._timeout
679s         else:
679s             if not isinstance(timeout, ClientTimeout):
679s                 real_timeout = ClientTimeout(total=timeout)
679s             else:
679s                 real_timeout = timeout
679s         # timeout is cumulative for all request operations
679s         # (request, redirects, responses, data consuming)
679s         tm = TimeoutHandle(
679s             self._loop, real_timeout.total, ceil_threshold=real_timeout.ceil_threshold
679s         )
679s         handle = tm.start()
679s
679s         if read_bufsize is None:
679s             read_bufsize = self._read_bufsize
679s
679s         if auto_decompress is None:
679s             auto_decompress = self._auto_decompress
679s
679s         if max_line_size is None:
679s             max_line_size = self._max_line_size
679s
679s         if max_field_size is None:
679s             max_field_size = self._max_field_size
679s
679s         traces = [
679s             Trace(
679s                 self,
679s                 trace_config,
679s                 trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx),
679s             )
679s             for trace_config in self._trace_configs
679s         ]
679s
679s         for trace in traces:
679s             await trace.send_request_start(method, url.update_query(params), headers)
679s
679s         timer = tm.timer()
679s         try:
679s             with timer:
679s                 # https://www.rfc-editor.org/rfc/rfc9112.html#name-retrying-requests
679s                 retry_persistent_connection = (
679s                     self._retry_connection and method in IDEMPOTENT_METHODS
679s                 )
679s                 while True:
679s                     url, auth_from_url = strip_auth_from_url(url)
679s                     if not url.raw_host:
679s                         # NOTE: Bail early, otherwise, causes `InvalidURL` through
679s                         # NOTE: `self._request_class()` below.
679s                         err_exc_cls = (
679s                             InvalidUrlRedirectClientError
679s                             if redirects
679s                             else InvalidUrlClientError
679s                         )
679s                         raise err_exc_cls(url)
679s                     # If `auth` was passed for an already authenticated URL,
679s                     # disallow only if this is the initial URL; this is to avoid issues
679s                     # with sketchy redirects that are not the caller's responsibility
679s                     if not history and (auth and auth_from_url):
679s                         raise ValueError(
679s                             "Cannot combine AUTH argument with "
679s                             "credentials encoded in URL"
679s                         )
679s
679s                     # Override the auth with the one from the URL only if we
679s                     # have no auth, or if we got an auth from a redirect URL
679s                     if auth is None or (history and auth_from_url is not None):
679s                         auth = auth_from_url
679s
679s                     if (
679s                         auth is None
679s                         and self._default_auth
679s                         and (
679s                             not self._base_url or self._base_url_origin == url.origin()
679s                         )
679s                     ):
679s                         auth = self._default_auth
679s                     # It would be confusing if we support explicit
679s                     # Authorization header with auth argument
679s                     if (
679s                         headers is not None
679s                         and auth is not None
679s                         and hdrs.AUTHORIZATION in headers
679s                     ):
679s                         raise ValueError(
679s                             "Cannot combine AUTHORIZATION header "
679s                             "with AUTH argument or credentials "
679s                             "encoded in URL"
679s                         )
679s
679s                     all_cookies = self._cookie_jar.filter_cookies(url)
679s
679s                     if cookies is not None:
679s                         tmp_cookie_jar = CookieJar(
679s                             quote_cookie=self._cookie_jar.quote_cookie
679s                         )
679s                         tmp_cookie_jar.update_cookies(cookies)
679s                         req_cookies = tmp_cookie_jar.filter_cookies(url)
679s                         if req_cookies:
679s                             all_cookies.load(req_cookies)
679s
679s                     if proxy is not None:
679s                         proxy = URL(proxy)
679s                     elif self._trust_env:
679s                         with suppress(LookupError):
679s                             proxy, proxy_auth = get_env_proxy_for_url(url)
679s
679s                     req = self._request_class(
679s                         method,
679s                         url,
679s                         params=params,
679s                         headers=headers,
679s                         skip_auto_headers=skip_headers,
679s                         data=data,
679s                         cookies=all_cookies,
679s                         auth=auth,
679s                         version=version,
679s                         compress=compress,
679s                         chunked=chunked,
679s                         expect100=expect100,
679s                         loop=self._loop,
679s                         response_class=self._response_class,
679s                         proxy=proxy,
679s                         proxy_auth=proxy_auth,
679s                         timer=timer,
679s                         session=self,
679s                         ssl=ssl if ssl is not None else True,
679s                         server_hostname=server_hostname,
679s                         proxy_headers=proxy_headers,
679s                         traces=traces,
679s                         trust_env=self.trust_env,
679s                     )
679s
679s                     # connection timeout
679s                     try:
679s                         conn = await self._connector.connect(
679s                             req, traces=traces, timeout=real_timeout
679s                         )
679s                     except asyncio.TimeoutError as exc:
679s >                       raise ConnectionTimeoutError(
679s                             f"Connection timeout to host {url}"
679s                         ) from exc
679s E                       aiohttp.client_exceptions.ConnectionTimeoutError: Connection timeout to host https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt
679s
679s /usr/lib/python3/dist-packages/aiohttp/client.py:707: ConnectionTimeoutError
679s
679s The above exception was the direct cause of the following exception:
679s
679s     def test_github_open_lfs_file():
679s         # test opening a git-lfs tracked file
679s >       with fsspec.open(
679s             "github://cBioPortal:datahub@55cd360"
679s             "/public/acc_2019/data_gene_panel_matrix.txt",
679s             block_size=0,
679s         ) as f:
679s
679s /tmp/autopkgtest.kj26U7/autopkgtest_tmp/implementations_tests/test_github.py:24:
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s /usr/lib/python3/dist-packages/fsspec/core.py:105: in __enter__
679s     f = self.fs.open(self.path, mode=mode)
679s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
679s     f = self._open(
679s /usr/lib/python3/dist-packages/fsspec/implementations/github.py:261: in _open
679s     return self.http_fs.open(
679s /usr/lib/python3/dist-packages/fsspec/spec.py:1310: in open
679s     f = self._open(
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:366: in _open
679s     size = size or info.update(self.info(path, **kwargs)) or info["size"]
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:118: in wrapper
679s     return sync(self.loop, func, *args, **kwargs)
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:103: in sync
679s     raise return_result
679s /usr/lib/python3/dist-packages/fsspec/asyn.py:56: in _runner
679s     result[0] = await coro
679s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
679s
679s self =
679s url = 'https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt'
679s kwargs = {}, info = {}
679s session = , policy = 'get'
679s
679s     async def _info(self, url, **kwargs):
679s         """Get info of URL
679s
679s         Tries to access location via HEAD, and then GET methods, but does
679s         not fetch the data.
679s
679s         It is possible that the server does not supply any size information, in
679s         which case size will be given as None (and certain operations on the
679s         corresponding file will not work).
679s         """
679s         info = {}
679s         session = await self.set_session()
679s
679s         for policy in ["head", "get"]:
679s             try:
679s                 info.update(
679s                     await _file_info(
679s                         self.encode_url(url),
679s                         size_policy=policy,
679s                         session=session,
679s                         **self.kwargs,
679s                         **kwargs,
679s                     )
679s                 )
679s                 if info.get("size") is not None:
679s                     break
679s             except Exception as exc:
679s                 if policy == "get":
679s                     # If get failed, then raise a FileNotFoundError
679s >                   raise FileNotFoundError(url) from exc
679s E                   FileNotFoundError: https://media.githubusercontent.com/media/cBioPortal/datahub/55cd360/public/acc_2019/data_gene_panel_matrix.txt
679s
679s /usr/lib/python3/dist-packages/fsspec/implementations/http.py:440: FileNotFoundError
679s =========================== short test summary info ============================
679s FAILED implementations_tests/test_github.py::test_github_open_large_file - Fi...
679s FAILED implementations_tests/test_github.py::test_github_open_lfs_file - File...
679s ===== 2 failed, 892 passed, 115 skipped, 3 deselected in 525.65s (0:08:45) =====
680s autopkgtest [08:32:52]: test fsspec-tests: -----------------------]
680s autopkgtest [08:32:52]: test fsspec-tests:  - - - - - - - - - - results - - - - - - - - - -
680s fsspec-tests         FAIL non-zero exit status 1
680s autopkgtest [08:32:52]: @@@@@@@@@@@@@@@@@@@@ summary
680s fsspec-tests         FAIL non-zero exit status 1
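The innermost frame in the traceback above is asyncio's timeout context manager (`/usr/lib/python3.13/asyncio/timeouts.py:116`) converting the `CancelledError` it injected into a `TimeoutError` on exit, which aiohttp then wraps as `ConnectionTimeoutError`. A minimal standalone sketch of that first conversion, independent of aiohttp:

```python
import asyncio

async def main():
    # asyncio.timeout() (Python 3.11+) cancels the awaited task when the
    # deadline expires; its __aexit__ then re-raises the resulting
    # CancelledError as TimeoutError, as seen in the traceback above.
    try:
        async with asyncio.timeout(0.01):
            await asyncio.sleep(1)
    except TimeoutError:
        return "timed out"

print(asyncio.run(main()))  # → timed out
```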
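The `FileNotFoundError` in the traceback comes from fsspec's `_info`, which probes each size policy in turn ("head", then "get") and treats only a failure of the final GET probe as fatal. A standalone sketch of that fallback pattern, with hypothetical stub fetchers standing in for fsspec's `_file_info`:

```python
import asyncio

async def probe_info(url, fetchers):
    # Try each policy in order; stop once a size is known.
    # Only a failure of the last policy ("get") raises FileNotFoundError,
    # mirroring the logic shown in fsspec's _info above.
    info = {}
    for policy, fetch in fetchers.items():
        try:
            info.update(await fetch(url))
            if info.get("size") is not None:
                break
        except Exception as exc:
            if policy == "get":
                raise FileNotFoundError(url) from exc
    return info

async def head(url):
    return {"size": None}   # hypothetical: server omits Content-Length on HEAD

async def get(url):
    return {"size": 1024}   # hypothetical: GET reveals the size

# dicts preserve insertion order, so "head" is probed before "get"
print(asyncio.run(probe_info("https://example.com/f.txt", {"head": head, "get": get})))
```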
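The `timeout is sentinel` branch in aiohttp's `_request` above exists because `None` is itself a meaningful timeout value, so a sentinel object is needed to distinguish "argument omitted" from "explicitly None". A minimal sketch of that pattern (names hypothetical, not aiohttp's):

```python
_SENTINEL = object()  # unique marker meaning "caller did not pass this argument"

def pick_timeout(timeout=_SENTINEL):
    if timeout is _SENTINEL:
        return "session default"   # argument omitted entirely
    if timeout is None:
        return "no timeout"        # caller explicitly disabled the timeout
    return f"total={timeout}"

print(pick_timeout(), pick_timeout(None), pick_timeout(5))
```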