0s autopkgtest [16:58:55]: starting date and time: 2025-11-17 16:58:55+0000 0s autopkgtest [16:58:55]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message 0s autopkgtest [16:58:55]: host juju-7f2275-prod-proposed-migration-environment-9; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.ygmgod9l/out --timeout-copy=6000 --setup-commands 'ln -s /dev/null /etc/systemd/system/bluetooth.service; printf "http_proxy=http://squid.internal:3128\nhttps_proxy=http://squid.internal:3128\nno_proxy=127.0.0.1,127.0.1.1,localhost,localdomain,internal,login.ubuntu.com,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com\n" >> /etc/environment' --apt-pocket=proposed=src:intake --apt-upgrade intake --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=intake/0.6.6-4 -- lxd -r lxd-armhf-10.145.243.58 lxd-armhf-10.145.243.58:autopkgtest/ubuntu/resolute/armhf 21s autopkgtest [16:59:16]: testbed dpkg architecture: armhf 23s autopkgtest [16:59:18]: testbed apt version: 3.1.11 27s autopkgtest [16:59:22]: @@@@@@@@@@@@@@@@@@@@ test bed setup 30s autopkgtest [16:59:25]: testbed release detected to be: None 37s autopkgtest [16:59:32]: updating testbed package index (apt update) 39s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease [87.8 kB] 39s Get:2 http://ftpmaster.internal/ubuntu resolute InRelease [87.8 kB] 40s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease 40s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease 40s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/restricted Sources [9852 B] 40s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/main Sources [73.2 kB] 40s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/universe Sources [779 kB] 40s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse Sources [22.9 kB] 40s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main armhf Packages [134 kB] 40s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/restricted armhf Packages [940 B] 40s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/universe armhf Packages [474 kB] 40s Get:12 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse armhf Packages [9684 B] 40s Get:13 http://ftpmaster.internal/ubuntu resolute/main Sources [1416 kB] 40s Get:14 http://ftpmaster.internal/ubuntu resolute/universe Sources [21.3 MB] 46s Get:15 http://ftpmaster.internal/ubuntu resolute/main armhf Packages [1369 kB] 47s Get:16 http://ftpmaster.internal/ubuntu resolute/universe armhf Packages [15.4 MB] 52s Fetched 41.2 MB in 13s (3171 kB/s) 53s Reading package lists... 59s autopkgtest [16:59:54]: upgrading testbed (apt dist-upgrade and autopurge) 60s Reading package lists... 61s Building dependency tree... 61s Reading state information... 61s Calculating upgrade... 61s The following packages will be upgraded: 61s apt libapt-pkg7.0 libcrypt1 usbutils 62s 4 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 62s Need to get 2774 kB of archives. 62s After this operation, 8192 B of additional disk space will be used. 
62s Get:1 http://ftpmaster.internal/ubuntu resolute/main armhf libapt-pkg7.0 armhf 3.1.12 [1157 kB] 62s Get:2 http://ftpmaster.internal/ubuntu resolute/main armhf apt armhf 3.1.12 [1440 kB] 62s Get:3 http://ftpmaster.internal/ubuntu resolute/main armhf libcrypt1 armhf 1:4.5.1-1 [98.9 kB] 62s Get:4 http://ftpmaster.internal/ubuntu resolute/main armhf usbutils armhf 1:019-1 [77.7 kB] 63s Fetched 2774 kB in 1s (3247 kB/s) 63s (Reading database ... 65904 files and directories currently installed.) 63s Preparing to unpack .../libapt-pkg7.0_3.1.12_armhf.deb ... 63s Unpacking libapt-pkg7.0:armhf (3.1.12) over (3.1.11) ... 63s Setting up libapt-pkg7.0:armhf (3.1.12) ... 63s (Reading database ... 65904 files and directories currently installed.) 63s Preparing to unpack .../archives/apt_3.1.12_armhf.deb ... 63s Unpacking apt (3.1.12) over (3.1.11) ... 63s Setting up apt (3.1.12) ... 64s (Reading database ... 65904 files and directories currently installed.) 64s Preparing to unpack .../libcrypt1_1%3a4.5.1-1_armhf.deb ... 64s Unpacking libcrypt1:armhf (1:4.5.1-1) over (1:4.4.38-1build1) ... 64s Setting up libcrypt1:armhf (1:4.5.1-1) ... 64s (Reading database ... 65904 files and directories currently installed.) 64s Preparing to unpack .../usbutils_1%3a019-1_armhf.deb ... 64s Unpacking usbutils (1:019-1) over (1:018-2) ... 64s Setting up usbutils (1:019-1) ... 
64s Processing triggers for man-db (2.13.1-1) ... 66s Processing triggers for libc-bin (2.42-2ubuntu2) ... 69s Reading package lists... 69s Building dependency tree... 69s Reading state information... 70s Solving dependencies... 70s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 72s autopkgtest [17:00:07]: rebooting testbed after setup commands that affected boot 113s autopkgtest [17:00:48]: testbed running kernel: Linux 6.8.0-86-generic #87~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 29 09:26:46 UTC 2 139s autopkgtest [17:01:14]: @@@@@@@@@@@@@@@@@@@@ apt-source intake 153s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (dsc) [2693 B] 153s Get:2 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (tar) [4447 kB] 153s Get:3 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (diff) [15.8 kB] 153s gpgv: Signature made Wed Aug 27 08:46:02 2025 UTC 153s gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A 153s gpgv: issuer "tchet@debian.org" 153s gpgv: Can't check signature: No public key 153s dpkg-source: warning: cannot verify inline signature for ./intake_0.6.6-4.dsc: no acceptable signature found 153s autopkgtest [17:01:28]: testing package intake version 0.6.6-4 155s autopkgtest [17:01:30]: build not needed 160s autopkgtest [17:01:35]: test run-unit-test: preparing testbed 161s Reading package lists... 162s Building dependency tree... 162s Reading state information... 162s Solving dependencies... 162s The following NEW packages will be installed: 162s fonts-font-awesome fonts-glyphicons-halflings fonts-lato libblas3 162s libgfortran5 libjs-bootstrap libjs-jquery libjs-sphinxdoc libjs-underscore 162s liblapack3 node-html5shiv python3-aiohappyeyeballs python3-aiohttp 162s python3-aiosignal python3-all python3-async-timeout python3-click 162s python3-cloudpickle python3-dask python3-entrypoints python3-frozenlist 162s python3-fsspec python3-iniconfig python3-intake python3-intake-doc 162s python3-locket python3-msgpack python3-msgpack-numpy python3-multidict 162s python3-numpy python3-numpy-dev python3-pandas python3-pandas-lib 162s python3-partd python3-platformdirs python3-pluggy python3-propcache 162s python3-pytest python3-pytz python3-toolz python3-tornado python3-yarl 162s sphinx-rtd-theme-common 162s 0 upgraded, 43 newly installed, 0 to remove and 0 not upgraded. 162s Need to get 26.8 MB of archives. 162s After this operation, 121 MB of additional disk space will be used. 
162s Get:1 http://ftpmaster.internal/ubuntu resolute/main armhf fonts-lato all 2.015-1 [2781 kB] 163s Get:2 http://ftpmaster.internal/ubuntu resolute/main armhf python3-numpy-dev armhf 1:2.2.4+ds-1ubuntu1 [141 kB] 163s Get:3 http://ftpmaster.internal/ubuntu resolute/main armhf libblas3 armhf 3.12.1-7 [133 kB] 163s Get:4 http://ftpmaster.internal/ubuntu resolute/main armhf libgfortran5 armhf 15.2.0-7ubuntu1 [334 kB] 163s Get:5 http://ftpmaster.internal/ubuntu resolute/main armhf liblapack3 armhf 3.12.1-7 [2091 kB] 163s Get:6 http://ftpmaster.internal/ubuntu resolute/main armhf python3-numpy armhf 1:2.2.4+ds-1ubuntu1 [3724 kB] 164s Get:7 http://ftpmaster.internal/ubuntu resolute/main armhf fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 164s Get:8 http://ftpmaster.internal/ubuntu resolute/universe armhf fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-6 [119 kB] 164s Get:9 http://ftpmaster.internal/ubuntu resolute/universe armhf libjs-bootstrap all 3.4.1+dfsg-6 [129 kB] 164s Get:10 http://ftpmaster.internal/ubuntu resolute/main armhf libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 164s Get:11 http://ftpmaster.internal/ubuntu resolute/main armhf libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 164s Get:12 http://ftpmaster.internal/ubuntu resolute/main armhf libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB] 164s Get:13 http://ftpmaster.internal/ubuntu resolute/universe armhf node-html5shiv all 3.7.3+dfsg-5 [13.5 kB] 164s Get:14 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-aiohappyeyeballs all 2.6.1-2 [11.1 kB] 164s Get:15 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-multidict armhf 6.4.3-1build1 [67.0 kB] 164s Get:16 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-propcache armhf 0.3.1-1build1 [50.5 kB] 164s Get:17 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-yarl armhf 1.22.0-1 [97.6 kB] 164s Get:18 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-async-timeout all 5.0.1-1 [6830 B] 164s Get:19 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-frozenlist armhf 1.8.0-1 [53.5 kB] 164s Get:20 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-aiosignal all 1.4.0-1 [5628 B] 164s Get:21 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-aiohttp armhf 3.11.16-1 [342 kB] 164s Get:22 http://ftpmaster.internal/ubuntu resolute/main armhf python3-all armhf 3.13.7-1 [884 B] 164s Get:23 http://ftpmaster.internal/ubuntu resolute/main armhf python3-click all 8.2.0+0.really.8.1.8-1 [80.0 kB] 164s Get:24 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-cloudpickle all 3.1.1-1 [22.4 kB] 164s Get:25 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-fsspec all 2025.3.2-1ubuntu1 [217 kB] 164s Get:26 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-toolz all 1.0.0-2 [45.0 kB] 164s Get:27 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-locket all 1.0.0-2 [5872 B] 164s Get:28 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-partd all 1.4.2-1 [15.7 kB] 164s Get:29 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-dask all 2024.12.1+dfsg-2 [875 kB] 164s Get:30 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-entrypoints all 0.4-3 [7174 B] 164s Get:31 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-iniconfig all 2.1.0-1 [6840 B] 164s Get:32 http://ftpmaster.internal/ubuntu resolute/main armhf python3-msgpack armhf 1.0.3-3build5 [108 kB] 164s Get:33 
http://ftpmaster.internal/ubuntu resolute/main armhf python3-platformdirs all 4.3.7-1 [16.9 kB] 164s Get:34 http://ftpmaster.internal/ubuntu resolute-proposed/universe armhf python3-intake armhf 0.6.6-4 [197 kB] 164s Get:35 http://ftpmaster.internal/ubuntu resolute/main armhf sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB] 164s Get:36 http://ftpmaster.internal/ubuntu resolute-proposed/universe armhf python3-intake-doc all 0.6.6-4 [1549 kB] 164s Get:37 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-msgpack-numpy all 0.4.8-1 [7388 B] 164s Get:38 http://ftpmaster.internal/ubuntu resolute/main armhf python3-pytz all 2025.2-4 [32.3 kB] 164s Get:39 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-pandas-lib armhf 2.3.3+dfsg-1ubuntu1 [8020 kB] 165s Get:40 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-pandas all 2.3.3+dfsg-1ubuntu1 [2948 kB] 165s Get:41 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-pluggy all 1.6.0-1 [21.0 kB] 165s Get:42 http://ftpmaster.internal/ubuntu resolute/universe armhf python3-pytest all 8.3.5-2 [252 kB] 165s Get:43 http://ftpmaster.internal/ubuntu resolute/main armhf python3-tornado armhf 6.5.2-3 [304 kB] 165s Fetched 26.8 MB in 3s (10.5 MB/s) 165s Selecting previously unselected package fonts-lato. 165s (Reading database ... 65904 files and directories currently installed.) 165s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 165s Unpacking fonts-lato (2.015-1) ... 166s Selecting previously unselected package python3-numpy-dev:armhf. 166s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_armhf.deb ... 166s Unpacking python3-numpy-dev:armhf (1:2.2.4+ds-1ubuntu1) ... 166s Selecting previously unselected package libblas3:armhf. 166s Preparing to unpack .../02-libblas3_3.12.1-7_armhf.deb ... 166s Unpacking libblas3:armhf (3.12.1-7) ... 166s Selecting previously unselected package libgfortran5:armhf. 166s Preparing to unpack .../03-libgfortran5_15.2.0-7ubuntu1_armhf.deb ... 166s Unpacking libgfortran5:armhf (15.2.0-7ubuntu1) ... 166s Selecting previously unselected package liblapack3:armhf. 166s Preparing to unpack .../04-liblapack3_3.12.1-7_armhf.deb ... 166s Unpacking liblapack3:armhf (3.12.1-7) ... 166s Selecting previously unselected package python3-numpy. 166s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_armhf.deb ... 166s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ... 166s Selecting previously unselected package fonts-font-awesome. 166s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 166s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 166s Selecting previously unselected package fonts-glyphicons-halflings. 166s Preparing to unpack .../07-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-6_all.deb ... 166s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 166s Selecting previously unselected package libjs-bootstrap. 
166s Preparing to unpack .../08-libjs-bootstrap_3.4.1+dfsg-6_all.deb ... 166s Unpacking libjs-bootstrap (3.4.1+dfsg-6) ... 166s Selecting previously unselected package libjs-jquery. 166s Preparing to unpack .../09-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 166s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 166s Selecting previously unselected package libjs-underscore. 166s Preparing to unpack .../10-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 166s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 166s Selecting previously unselected package libjs-sphinxdoc. 166s Preparing to unpack .../11-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ... 166s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ... 166s Selecting previously unselected package node-html5shiv. 166s Preparing to unpack .../12-node-html5shiv_3.7.3+dfsg-5_all.deb ... 166s Unpacking node-html5shiv (3.7.3+dfsg-5) ... 166s Selecting previously unselected package python3-aiohappyeyeballs. 166s Preparing to unpack .../13-python3-aiohappyeyeballs_2.6.1-2_all.deb ... 166s Unpacking python3-aiohappyeyeballs (2.6.1-2) ... 166s Selecting previously unselected package python3-multidict. 166s Preparing to unpack .../14-python3-multidict_6.4.3-1build1_armhf.deb ... 166s Unpacking python3-multidict (6.4.3-1build1) ... 167s Selecting previously unselected package python3-propcache. 167s Preparing to unpack .../15-python3-propcache_0.3.1-1build1_armhf.deb ... 167s Unpacking python3-propcache (0.3.1-1build1) ... 167s Selecting previously unselected package python3-yarl. 167s Preparing to unpack .../16-python3-yarl_1.22.0-1_armhf.deb ... 167s Unpacking python3-yarl (1.22.0-1) ... 167s Selecting previously unselected package python3-async-timeout. 167s Preparing to unpack .../17-python3-async-timeout_5.0.1-1_all.deb ... 167s Unpacking python3-async-timeout (5.0.1-1) ... 167s Selecting previously unselected package python3-frozenlist. 167s Preparing to unpack .../18-python3-frozenlist_1.8.0-1_armhf.deb ... 167s Unpacking python3-frozenlist (1.8.0-1) ... 167s Selecting previously unselected package python3-aiosignal. 167s Preparing to unpack .../19-python3-aiosignal_1.4.0-1_all.deb ... 167s Unpacking python3-aiosignal (1.4.0-1) ... 167s Selecting previously unselected package python3-aiohttp. 167s Preparing to unpack .../20-python3-aiohttp_3.11.16-1_armhf.deb ... 167s Unpacking python3-aiohttp (3.11.16-1) ... 167s Selecting previously unselected package python3-all. 167s Preparing to unpack .../21-python3-all_3.13.7-1_armhf.deb ... 167s Unpacking python3-all (3.13.7-1) ... 167s Selecting previously unselected package python3-click. 167s Preparing to unpack .../22-python3-click_8.2.0+0.really.8.1.8-1_all.deb ... 167s Unpacking python3-click (8.2.0+0.really.8.1.8-1) ... 167s Selecting previously unselected package python3-cloudpickle. 167s Preparing to unpack .../23-python3-cloudpickle_3.1.1-1_all.deb ... 167s Unpacking python3-cloudpickle (3.1.1-1) ... 167s Selecting previously unselected package python3-fsspec. 167s Preparing to unpack .../24-python3-fsspec_2025.3.2-1ubuntu1_all.deb ... 167s Unpacking python3-fsspec (2025.3.2-1ubuntu1) ... 167s Selecting previously unselected package python3-toolz. 167s Preparing to unpack .../25-python3-toolz_1.0.0-2_all.deb ... 167s Unpacking python3-toolz (1.0.0-2) ... 167s Selecting previously unselected package python3-locket. 167s Preparing to unpack .../26-python3-locket_1.0.0-2_all.deb ... 167s Unpacking python3-locket (1.0.0-2) ... 167s Selecting previously unselected package python3-partd. 
167s Preparing to unpack .../27-python3-partd_1.4.2-1_all.deb ... 167s Unpacking python3-partd (1.4.2-1) ... 167s Selecting previously unselected package python3-dask. 167s Preparing to unpack .../28-python3-dask_2024.12.1+dfsg-2_all.deb ... 167s Unpacking python3-dask (2024.12.1+dfsg-2) ... 167s Selecting previously unselected package python3-entrypoints. 167s Preparing to unpack .../29-python3-entrypoints_0.4-3_all.deb ... 167s Unpacking python3-entrypoints (0.4-3) ... 167s Selecting previously unselected package python3-iniconfig. 167s Preparing to unpack .../30-python3-iniconfig_2.1.0-1_all.deb ... 167s Unpacking python3-iniconfig (2.1.0-1) ... 167s Selecting previously unselected package python3-msgpack. 167s Preparing to unpack .../31-python3-msgpack_1.0.3-3build5_armhf.deb ... 167s Unpacking python3-msgpack (1.0.3-3build5) ... 167s Selecting previously unselected package python3-platformdirs. 167s Preparing to unpack .../32-python3-platformdirs_4.3.7-1_all.deb ... 167s Unpacking python3-platformdirs (4.3.7-1) ... 167s Selecting previously unselected package python3-intake. 167s Preparing to unpack .../33-python3-intake_0.6.6-4_armhf.deb ... 167s Unpacking python3-intake (0.6.6-4) ... 167s Selecting previously unselected package sphinx-rtd-theme-common. 167s Preparing to unpack .../34-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ... 167s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 167s Selecting previously unselected package python3-intake-doc. 167s Preparing to unpack .../35-python3-intake-doc_0.6.6-4_all.deb ... 167s Unpacking python3-intake-doc (0.6.6-4) ... 167s Selecting previously unselected package python3-msgpack-numpy. 167s Preparing to unpack .../36-python3-msgpack-numpy_0.4.8-1_all.deb ... 167s Unpacking python3-msgpack-numpy (0.4.8-1) ... 167s Selecting previously unselected package python3-pytz. 167s Preparing to unpack .../37-python3-pytz_2025.2-4_all.deb ... 167s Unpacking python3-pytz (2025.2-4) ... 167s Selecting previously unselected package python3-pandas-lib:armhf. 167s Preparing to unpack .../38-python3-pandas-lib_2.3.3+dfsg-1ubuntu1_armhf.deb ... 167s Unpacking python3-pandas-lib:armhf (2.3.3+dfsg-1ubuntu1) ... 168s Selecting previously unselected package python3-pandas. 168s Preparing to unpack .../39-python3-pandas_2.3.3+dfsg-1ubuntu1_all.deb ... 168s Unpacking python3-pandas (2.3.3+dfsg-1ubuntu1) ... 168s Selecting previously unselected package python3-pluggy. 168s Preparing to unpack .../40-python3-pluggy_1.6.0-1_all.deb ... 168s Unpacking python3-pluggy (1.6.0-1) ... 168s Selecting previously unselected package python3-pytest. 168s Preparing to unpack .../41-python3-pytest_8.3.5-2_all.deb ... 168s Unpacking python3-pytest (8.3.5-2) ... 168s Selecting previously unselected package python3-tornado. 168s Preparing to unpack .../42-python3-tornado_6.5.2-3_armhf.deb ... 168s Unpacking python3-tornado (6.5.2-3) ... 168s Setting up python3-entrypoints (0.4-3) ... 168s Setting up python3-iniconfig (2.1.0-1) ... 168s Setting up python3-tornado (6.5.2-3) ... 169s Setting up fonts-lato (2.015-1) ... 169s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 169s Setting up python3-fsspec (2025.3.2-1ubuntu1) ... 170s Setting up node-html5shiv (3.7.3+dfsg-5) ... 170s Setting up python3-all (3.13.7-1) ... 170s Setting up python3-pytz (2025.2-4) ... 170s Setting up python3-click (8.2.0+0.really.8.1.8-1) ... 170s Setting up python3-platformdirs (4.3.7-1) ... 170s Setting up python3-multidict (6.4.3-1build1) ... 
170s Setting up python3-cloudpickle (3.1.1-1) ... 170s Setting up python3-frozenlist (1.8.0-1) ... 171s Setting up python3-aiosignal (1.4.0-1) ... 171s Setting up python3-async-timeout (5.0.1-1) ... 171s Setting up libblas3:armhf (3.12.1-7) ... 171s update-alternatives: using /usr/lib/arm-linux-gnueabihf/blas/libblas.so.3 to provide /usr/lib/arm-linux-gnueabihf/libblas.so.3 (libblas.so.3-arm-linux-gnueabihf) in auto mode 171s Setting up python3-numpy-dev:armhf (1:2.2.4+ds-1ubuntu1) ... 171s Setting up python3-aiohappyeyeballs (2.6.1-2) ... 171s Setting up libgfortran5:armhf (15.2.0-7ubuntu1) ... 171s Setting up python3-pluggy (1.6.0-1) ... 171s Setting up python3-propcache (0.3.1-1build1) ... 171s Setting up python3-toolz (1.0.0-2) ... 172s Setting up python3-msgpack (1.0.3-3build5) ... 172s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 172s Setting up python3-locket (1.0.0-2) ... 172s Setting up python3-yarl (1.22.0-1) ... 172s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 172s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 172s Setting up libjs-bootstrap (3.4.1+dfsg-6) ... 172s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 172s Setting up python3-partd (1.4.2-1) ... 173s Setting up liblapack3:armhf (3.12.1-7) ... 173s update-alternatives: using /usr/lib/arm-linux-gnueabihf/lapack/liblapack.so.3 to provide /usr/lib/arm-linux-gnueabihf/liblapack.so.3 (liblapack.so.3-arm-linux-gnueabihf) in auto mode 173s Setting up python3-pytest (8.3.5-2) ... 173s Setting up python3-aiohttp (3.11.16-1) ... 174s Setting up python3-dask (2024.12.1+dfsg-2) ... 176s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ... 178s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ... 178s Setting up python3-intake (0.6.6-4) ... 178s Setting up python3-msgpack-numpy (0.4.8-1) ... 178s Setting up python3-pandas-lib:armhf (2.3.3+dfsg-1ubuntu1) ... 178s Setting up python3-intake-doc (0.6.6-4) ... 178s Setting up python3-pandas (2.3.3+dfsg-1ubuntu1) ... 184s Processing triggers for man-db (2.13.1-1) ... 184s Processing triggers for libc-bin (2.42-2ubuntu2) ... 194s autopkgtest [17:02:09]: test run-unit-test: [----------------------- 196s ============================= test session starts ============================== 196s platform linux -- Python 3.13.9, pytest-8.3.5, pluggy-1.6.0 -- /usr/bin/python3.13 196s cachedir: .pytest_cache 196s rootdir: /tmp/autopkgtest.kHxhri/build.SqR/src 196s plugins: typeguard-4.4.2 198s collecting ... 
collected 424 items / 11 skipped 198s 198s intake/auth/tests/test_auth.py::test_get PASSED [ 0%] 198s intake/auth/tests/test_auth.py::test_base PASSED [ 0%] 198s intake/auth/tests/test_auth.py::test_base_client PASSED [ 0%] 198s intake/auth/tests/test_auth.py::test_base_get_case_insensitive PASSED [ 0%] 198s intake/auth/tests/test_auth.py::test_secret PASSED [ 1%] 198s intake/auth/tests/test_auth.py::test_secret_client PASSED [ 1%] 198s intake/catalog/tests/test_alias.py::test_simple PASSED [ 1%] 198s intake/catalog/tests/test_alias.py::test_mapping PASSED [ 1%] 202s intake/catalog/tests/test_auth_integration.py::test_secret_auth PASSED [ 2%] 205s intake/catalog/tests/test_auth_integration.py::test_secret_auth_fail PASSED [ 2%] 205s intake/catalog/tests/test_caching_integration.py::test_load_csv PASSED [ 2%] 205s intake/catalog/tests/test_caching_integration.py::test_list_of_files PASSED [ 2%] 205s intake/catalog/tests/test_caching_integration.py::test_bad_type_cache PASSED [ 3%] 205s intake/catalog/tests/test_caching_integration.py::test_load_textfile FAILED [ 3%] 205s intake/catalog/tests/test_caching_integration.py::test_load_arr PASSED [ 3%] 205s intake/catalog/tests/test_caching_integration.py::test_regex[test_no_regex] PASSED [ 3%] 205s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_no_match] PASSED [ 4%] 205s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_partial_match] PASSED [ 4%] 205s intake/catalog/tests/test_caching_integration.py::test_get_metadata PASSED [ 4%] 205s intake/catalog/tests/test_caching_integration.py::test_clear_cache PASSED [ 4%] 205s intake/catalog/tests/test_caching_integration.py::test_clear_cache_bad_metadata PASSED [ 4%] 205s intake/catalog/tests/test_caching_integration.py::test_clear_all PASSED [ 5%] 205s intake/catalog/tests/test_caching_integration.py::test_second_load PASSED [ 5%] 206s intake/catalog/tests/test_caching_integration.py::test_second_load_timestamp PASSED [ 5%] 206s intake/catalog/tests/test_caching_integration.py::test_second_load_refresh PASSED [ 5%] 206s intake/catalog/tests/test_caching_integration.py::test_multiple_cache PASSED [ 6%] 206s intake/catalog/tests/test_caching_integration.py::test_disable_caching PASSED [ 6%] 206s intake/catalog/tests/test_caching_integration.py::test_ds_set_cache_dir PASSED [ 6%] 206s intake/catalog/tests/test_catalog_save.py::test_catalog_description PASSED [ 6%] 206s intake/catalog/tests/test_core.py::test_no_entry PASSED [ 7%] 206s intake/catalog/tests/test_core.py::test_regression PASSED [ 7%] 206s intake/catalog/tests/test_default.py::test_load PASSED [ 7%] 206s intake/catalog/tests/test_discovery.py::test_catalog_discovery PASSED [ 7%] 206s intake/catalog/tests/test_discovery.py::test_deferred_import PASSED [ 8%] 206s intake/catalog/tests/test_gui.py::test_cat_no_panel_does_not_raise_errors PASSED [ 8%] 206s intake/catalog/tests/test_gui.py::test_cat_no_panel_display_gui PASSED [ 8%] 206s intake/catalog/tests/test_gui.py::test_cat_gui SKIPPED (could not im...) [ 8%] 206s intake/catalog/tests/test_gui.py::test_entry_no_panel_does_not_raise_errors PASSED [ 8%] 206s intake/catalog/tests/test_gui.py::test_entry_no_panel_display_gui PASSED [ 9%] 206s intake/catalog/tests/test_gui.py::test_entry_gui SKIPPED (could not ...) 
[ 9%] 206s intake/catalog/tests/test_local.py::test_local_catalog PASSED [ 9%] 206s intake/catalog/tests/test_local.py::test_get_items PASSED [ 9%] 206s intake/catalog/tests/test_local.py::test_nested FAILED [ 10%] 206s intake/catalog/tests/test_local.py::test_nested_gets_name_from_super PASSED [ 10%] 206s intake/catalog/tests/test_local.py::test_hash PASSED [ 10%] 206s intake/catalog/tests/test_local.py::test_getitem PASSED [ 10%] 206s intake/catalog/tests/test_local.py::test_source_plugin_config PASSED [ 11%] 206s intake/catalog/tests/test_local.py::test_metadata PASSED [ 11%] 206s intake/catalog/tests/test_local.py::test_use_source_plugin_from_config PASSED [ 11%] 206s intake/catalog/tests/test_local.py::test_get_dir PASSED [ 11%] 206s intake/catalog/tests/test_local.py::test_entry_dir_function PASSED [ 12%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[bool-False] PASSED [ 12%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[datetime-expected1] PASSED [ 12%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[float-0.0] PASSED [ 12%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[int-0] PASSED [ 12%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[list-expected4] PASSED [ 13%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[str-] PASSED [ 13%] 206s intake/catalog/tests/test_local.py::test_user_parameter_default_value[unicode-] PASSED [ 13%] 206s intake/catalog/tests/test_local.py::test_user_parameter_repr PASSED [ 13%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-true-True] PASSED [ 14%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-0-False] PASSED [ 14%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-given2-expected2] PASSED [ 14%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-2018-01-01 12:34AM-expected3] PASSED [ 14%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-1234567890000000000-expected4] PASSED [ 15%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[float-3.14-3.14] PASSED [ 15%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[int-1-1] PASSED [ 15%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[list-given7-expected7] PASSED [ 15%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[str-1-1] PASSED [ 16%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[unicode-foo-foo] PASSED [ 16%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[now] PASSED [ 16%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[today] PASSED [ 16%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[float-100.0-100.0] PASSED [ 16%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20-20] PASSED [ 17%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20.0-20] PASSED [ 17%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[float-100.0-100.0] PASSED [ 17%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20-20] PASSED [ 17%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20.0-20] PASSED [ 18%] 206s 
intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[float-given0-expected0] PASSED [ 18%] 206s intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[int-given1-expected1] PASSED [ 18%] 206s intake/catalog/tests/test_local.py::test_user_parameter_validation_range PASSED [ 18%] 206s intake/catalog/tests/test_local.py::test_user_parameter_validation_allowed PASSED [ 19%] 206s intake/catalog/tests/test_local.py::test_user_pars_list PASSED [ 19%] 206s intake/catalog/tests/test_local.py::test_user_pars_mlist PASSED [ 19%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[catalog_non_dict] PASSED [ 19%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_missing] PASSED [ 20%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_name_non_string] PASSED [ 20%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_non_dict] PASSED [ 20%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_value_non_dict] PASSED [ 20%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_missing_required] PASSED [ 20%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_name_non_string] PASSED [ 21%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_non_dict] PASSED [ 21%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_choice] PASSED [ 21%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_type] PASSED [ 21%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_non_dict] PASSED [ 22%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_non_dict] PASSED [ 22%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing] PASSED [ 22%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing_key] PASSED [ 22%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_dict] PASSED [ 23%] 206s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_list] PASSED [ 23%] 206s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_data_source_list] PASSED [ 23%] 206s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_params_list] PASSED [ 23%] 206s intake/catalog/tests/test_local.py::test_union_catalog PASSED [ 24%] 206s intake/catalog/tests/test_local.py::test_persist_local_cat PASSED [ 24%] 206s intake/catalog/tests/test_local.py::test_empty_catalog PASSED [ 24%] 206s intake/catalog/tests/test_local.py::test_nonexistent_error PASSED [ 24%] 206s intake/catalog/tests/test_local.py::test_duplicate_data_sources PASSED [ 25%] 206s intake/catalog/tests/test_local.py::test_duplicate_parameters PASSED [ 25%] 207s intake/catalog/tests/test_local.py::test_catalog_file_removal PASSED [ 25%] 207s intake/catalog/tests/test_local.py::test_flatten_duplicate_error PASSED [ 25%] 207s intake/catalog/tests/test_local.py::test_multi_cat_names PASSED [ 25%] 207s intake/catalog/tests/test_local.py::test_name_of_builtin PASSED [ 26%] 207s intake/catalog/tests/test_local.py::test_cat_with_declared_name PASSED [ 26%] 207s intake/catalog/tests/test_local.py::test_cat_with_no_declared_name_gets_name_from_dir_if_file_named_catalog PASSED [ 26%] 207s intake/catalog/tests/test_local.py::test_default_expansions 
PASSED [ 26%] 207s intake/catalog/tests/test_local.py::test_remote_cat PASSED [ 27%] 207s intake/catalog/tests/test_local.py::test_multi_plugins PASSED [ 27%] 207s intake/catalog/tests/test_local.py::test_no_plugins PASSED [ 27%] 207s intake/catalog/tests/test_local.py::test_explicit_entry_driver PASSED [ 27%] 207s intake/catalog/tests/test_local.py::test_getitem_and_getattr PASSED [ 28%] 207s intake/catalog/tests/test_local.py::test_dot_names PASSED [ 28%] 207s intake/catalog/tests/test_local.py::test_listing PASSED [ 28%] 207s intake/catalog/tests/test_local.py::test_dict_save PASSED [ 28%] 207s intake/catalog/tests/test_local.py::test_dict_save_complex PASSED [ 29%] 207s intake/catalog/tests/test_local.py::test_dict_adddel PASSED [ 29%] 207s intake/catalog/tests/test_local.py::test_filter PASSED [ 29%] 207s intake/catalog/tests/test_local.py::test_from_dict_with_data_source PASSED [ 29%] 207s intake/catalog/tests/test_local.py::test_no_instance PASSED [ 29%] 207s intake/catalog/tests/test_local.py::test_fsspec_integration PASSED [ 30%] 207s intake/catalog/tests/test_local.py::test_cat_add PASSED [ 30%] 207s intake/catalog/tests/test_local.py::test_no_entries_items PASSED [ 30%] 207s intake/catalog/tests/test_local.py::test_cat_dictlike PASSED [ 30%] 207s intake/catalog/tests/test_local.py::test_inherit_params SKIPPED (tes...) [ 31%] 207s intake/catalog/tests/test_local.py::test_runtime_overwrite_params SKIPPED [ 31%] 207s intake/catalog/tests/test_local.py::test_local_param_overwrites SKIPPED [ 31%] 207s intake/catalog/tests/test_local.py::test_local_and_global_params SKIPPED [ 31%] 207s intake/catalog/tests/test_local.py::test_search_inherit_params SKIPPED [ 32%] 207s intake/catalog/tests/test_local.py::test_multiple_cats_params SKIPPED [ 32%] 207s intake/catalog/tests/test_parameters.py::test_simplest PASSED [ 32%] 207s intake/catalog/tests/test_parameters.py::test_cache_default_source PASSED [ 32%] 207s intake/catalog/tests/test_parameters.py::test_parameter_default PASSED [ 33%] 207s intake/catalog/tests/test_parameters.py::test_maybe_default_from_env PASSED [ 33%] 207s intake/catalog/tests/test_parameters.py::test_up_override_and_render PASSED [ 33%] 207s intake/catalog/tests/test_parameters.py::test_user_explicit_override PASSED [ 33%] 207s intake/catalog/tests/test_parameters.py::test_auto_env_expansion PASSED [ 33%] 207s intake/catalog/tests/test_parameters.py::test_validate_up PASSED [ 34%] 207s intake/catalog/tests/test_parameters.py::test_validate_par PASSED [ 34%] 207s intake/catalog/tests/test_parameters.py::test_mlist_parameter PASSED [ 34%] 207s intake/catalog/tests/test_parameters.py::test_explicit_overrides PASSED [ 34%] 207s intake/catalog/tests/test_parameters.py::test_extra_arg PASSED [ 35%] 207s intake/catalog/tests/test_parameters.py::test_unknown PASSED [ 35%] 207s intake/catalog/tests/test_parameters.py::test_catalog_passthrough PASSED [ 35%] 207s intake/catalog/tests/test_persist.py::test_idempotent SKIPPED (could...) [ 35%] 207s intake/catalog/tests/test_persist.py::test_parquet SKIPPED (could no...) 
[ 36%] 210s intake/catalog/tests/test_reload_integration.py::test_reload_updated_config PASSED [ 36%] 213s intake/catalog/tests/test_reload_integration.py::test_reload_updated_directory PASSED [ 36%] 215s intake/catalog/tests/test_reload_integration.py::test_reload_missing_remote_directory PASSED [ 36%] 217s intake/catalog/tests/test_reload_integration.py::test_reload_missing_local_directory PASSED [ 37%] 218s intake/catalog/tests/test_remote_integration.py::test_info_describe FAILED [ 37%] 218s intake/catalog/tests/test_remote_integration.py::test_bad_url PASSED [ 37%] 218s intake/catalog/tests/test_remote_integration.py::test_metadata PASSED [ 37%] 218s intake/catalog/tests/test_remote_integration.py::test_nested_remote PASSED [ 37%] 218s intake/catalog/tests/test_remote_integration.py::test_remote_direct FAILED [ 38%] 218s intake/catalog/tests/test_remote_integration.py::test_entry_metadata PASSED [ 38%] 218s intake/catalog/tests/test_remote_integration.py::test_unknown_source PASSED [ 38%] 218s intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface FAILED [ 38%] 218s intake/catalog/tests/test_remote_integration.py::test_environment_evaluation PASSED [ 39%] 218s intake/catalog/tests/test_remote_integration.py::test_read FAILED [ 39%] 218s intake/catalog/tests/test_remote_integration.py::test_read_direct PASSED [ 39%] 219s intake/catalog/tests/test_remote_integration.py::test_read_chunks FAILED [ 39%] 219s intake/catalog/tests/test_remote_integration.py::test_read_partition FAILED [ 40%] 219s intake/catalog/tests/test_remote_integration.py::test_close FAILED [ 40%] 219s intake/catalog/tests/test_remote_integration.py::test_with FAILED [ 40%] 219s intake/catalog/tests/test_remote_integration.py::test_pickle FAILED [ 40%] 219s intake/catalog/tests/test_remote_integration.py::test_to_dask FAILED [ 41%] 219s intake/catalog/tests/test_remote_integration.py::test_remote_env PASSED [ 41%] 219s intake/catalog/tests/test_remote_integration.py::test_remote_sequence FAILED [ 41%] 220s intake/catalog/tests/test_remote_integration.py::test_remote_arr PASSED [ 41%] 220s intake/catalog/tests/test_remote_integration.py::test_pagination PASSED [ 41%] 220s intake/catalog/tests/test_remote_integration.py::test_dir FAILED [ 42%] 220s intake/catalog/tests/test_remote_integration.py::test_getitem_and_getattr PASSED [ 42%] 220s intake/catalog/tests/test_remote_integration.py::test_search PASSED [ 42%] 220s intake/catalog/tests/test_remote_integration.py::test_access_subcatalog PASSED [ 42%] 220s intake/catalog/tests/test_remote_integration.py::test_len PASSED [ 43%] 221s intake/catalog/tests/test_remote_integration.py::test_datetime PASSED [ 43%] 221s intake/catalog/tests/test_utils.py::test_expand_templates PASSED [ 43%] 221s intake/catalog/tests/test_utils.py::test_expand_nested_template PASSED [ 43%] 221s intake/catalog/tests/test_utils.py::test_coerce_datetime[None-expected0] PASSED [ 44%] 221s intake/catalog/tests/test_utils.py::test_coerce_datetime[1-expected1] PASSED [ 44%] 221s intake/catalog/tests/test_utils.py::test_coerce_datetime[1988-02-24T13:37+0100-expected2] PASSED [ 44%] 221s intake/catalog/tests/test_utils.py::test_coerce_datetime[test_input3-expected3] PASSED [ 44%] 221s intake/catalog/tests/test_utils.py::test_flatten PASSED [ 45%] 221s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_0] PASSED [ 45%] 221s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_1] PASSED [ 45%] 221s intake/catalog/tests/test_utils.py::test_coerce[1-str-1] PASSED [ 45%] 221s 
intake/catalog/tests/test_utils.py::test_coerce[value3-list-expected3] PASSED [ 45%] 221s intake/catalog/tests/test_utils.py::test_coerce[value4-list-expected4] PASSED [ 46%] 221s intake/catalog/tests/test_utils.py::test_coerce[value5-list[str]-expected5] PASSED [ 46%] 222s intake/cli/client/tests/test_cache.py::test_help PASSED [ 46%] 222s intake/cli/client/tests/test_cache.py::test_list_keys PASSED [ 46%] 224s intake/cli/client/tests/test_cache.py::test_precache PASSED [ 47%] 224s intake/cli/client/tests/test_cache.py::test_clear_all PASSED [ 47%] 224s intake/cli/client/tests/test_cache.py::test_clear_one PASSED [ 47%] 225s intake/cli/client/tests/test_cache.py::test_usage PASSED [ 47%] 225s intake/cli/client/tests/test_conf.py::test_reset PASSED [ 48%] 225s intake/cli/client/tests/test_conf.py::test_info PASSED [ 48%] 226s intake/cli/client/tests/test_conf.py::test_defaults PASSED [ 48%] 226s intake/cli/client/tests/test_conf.py::test_get PASSED [ 48%] 226s intake/cli/client/tests/test_conf.py::test_log_level PASSED [ 49%] 227s intake/cli/client/tests/test_local_integration.py::test_list PASSED [ 49%] 227s intake/cli/client/tests/test_local_integration.py::test_full_list PASSED [ 49%] 227s intake/cli/client/tests/test_local_integration.py::test_describe PASSED [ 49%] 228s intake/cli/client/tests/test_local_integration.py::test_exists_pass PASSED [ 50%] 228s intake/cli/client/tests/test_local_integration.py::test_exists_fail PASSED [ 50%] 229s intake/cli/client/tests/test_local_integration.py::test_discover FAILED [ 50%] 230s intake/cli/client/tests/test_local_integration.py::test_get_pass FAILED [ 50%] 230s intake/cli/client/tests/test_local_integration.py::test_get_fail PASSED [ 50%] 231s intake/cli/client/tests/test_local_integration.py::test_example PASSED [ 51%] 231s intake/cli/server/tests/test_serializer.py::test_dataframe[ser0] SKIPPED [ 51%] 231s intake/cli/server/tests/test_serializer.py::test_dataframe[ser1] SKIPPED [ 51%] 231s intake/cli/server/tests/test_serializer.py::test_dataframe[ser2] SKIPPED [ 51%] 231s intake/cli/server/tests/test_serializer.py::test_ndarray[ser0] PASSED [ 52%] 231s intake/cli/server/tests/test_serializer.py::test_ndarray[ser1] PASSED [ 52%] 231s intake/cli/server/tests/test_serializer.py::test_ndarray[ser2] PASSED [ 52%] 231s intake/cli/server/tests/test_serializer.py::test_python[ser0] PASSED [ 52%] 231s intake/cli/server/tests/test_serializer.py::test_python[ser1] PASSED [ 53%] 231s intake/cli/server/tests/test_serializer.py::test_python[ser2] PASSED [ 53%] 231s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp0] PASSED [ 53%] 231s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp1] PASSED [ 53%] 231s intake/cli/server/tests/test_serializer.py::test_none_compress PASSED [ 54%] 231s intake/cli/server/tests/test_server.py::TestServerV1Info::test_info PASSED [ 54%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_bad_action PASSED [ 54%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer FAILED [ 54%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format FAILED [ 54%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open FAILED [ 55%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open_direct PASSED [ 55%] 231s intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_part_compressed SKIPPED [ 55%] 231s 
intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_partition SKIPPED [ 55%] 231s intake/cli/server/tests/test_server.py::test_flatten_flag PASSED [ 56%] 232s intake/cli/server/tests/test_server.py::test_port_flag PASSED [ 56%] 232s intake/cli/tests/test_util.py::test_print_entry_info PASSED [ 56%] 232s intake/cli/tests/test_util.py::test_die PASSED [ 56%] 232s intake/cli/tests/test_util.py::Test_nice_join::test_default PASSED [ 57%] 232s intake/cli/tests/test_util.py::Test_nice_join::test_string_conjunction PASSED [ 57%] 232s intake/cli/tests/test_util.py::Test_nice_join::test_None_conjunction PASSED [ 57%] 232s intake/cli/tests/test_util.py::Test_nice_join::test_sep PASSED [ 57%] 232s intake/cli/tests/test_util.py::TestSubcommand::test_initialize_abstract PASSED [ 58%] 232s intake/cli/tests/test_util.py::TestSubcommand::test_invoke_abstract PASSED [ 58%] 232s intake/container/tests/test_generics.py::test_generic_dataframe PASSED [ 58%] 233s intake/container/tests/test_persist.py::test_store PASSED [ 58%] 233s intake/container/tests/test_persist.py::test_backtrack PASSED [ 58%] 233s intake/container/tests/test_persist.py::test_persist_with_nonnumeric_ttl_raises_error PASSED [ 59%] 233s intake/container/tests/test_persist.py::test_undask_persist SKIPPED [ 59%] 233s intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors ERROR [ 59%] 233s intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui ERROR [ 59%] 233s intake/interface/tests/test_init_gui.py::test_display_init_gui ERROR [ 60%] 233s intake/source/tests/test_base.py::test_datasource_base_method_exceptions PASSED [ 60%] 233s intake/source/tests/test_base.py::test_name PASSED [ 60%] 233s intake/source/tests/test_base.py::test_datasource_base_context_manager PASSED [ 60%] 233s intake/source/tests/test_base.py::test_datasource_discover PASSED [ 61%] 233s intake/source/tests/test_base.py::test_datasource_read PASSED [ 61%] 233s intake/source/tests/test_base.py::test_datasource_read_chunked PASSED [ 61%] 233s intake/source/tests/test_base.py::test_datasource_read_partition PASSED [ 61%] 233s intake/source/tests/test_base.py::test_datasource_read_partition_out_of_range PASSED [ 62%] 233s intake/source/tests/test_base.py::test_datasource_to_dask PASSED [ 62%] 233s intake/source/tests/test_base.py::test_datasource_close PASSED [ 62%] 233s intake/source/tests/test_base.py::test_datasource_context_manager PASSED [ 62%] 233s intake/source/tests/test_base.py::test_datasource_pickle PASSED [ 62%] 233s intake/source/tests/test_base.py::test_datasource_python_discover PASSED [ 63%] 233s intake/source/tests/test_base.py::test_datasource_python_read PASSED [ 63%] 234s intake/source/tests/test_base.py::test_datasource_python_to_dask PASSED [ 63%] 234s intake/source/tests/test_base.py::test_yaml_method PASSED [ 63%] 234s intake/source/tests/test_base.py::test_alias_fail PASSED [ 64%] 234s intake/source/tests/test_base.py::test_reconfigure PASSED [ 64%] 234s intake/source/tests/test_base.py::test_import_name[data0] PASSED [ 64%] 234s intake/source/tests/test_base.py::test_import_name[data1] PASSED [ 64%] 234s intake/source/tests/test_base.py::test_import_name[data2] PASSED [ 65%] 234s intake/source/tests/test_base.py::test_import_name[data3] PASSED [ 65%] 234s intake/source/tests/test_base.py::test_import_name[data4] PASSED [ 65%] 234s intake/source/tests/test_cache.py::test_ensure_cache_dir PASSED [ 65%] 234s intake/source/tests/test_cache.py::test_munge_path PASSED [ 66%] 234s 
intake/source/tests/test_cache.py::test_hash PASSED [ 66%] 234s intake/source/tests/test_cache.py::test_path PASSED [ 66%] 234s intake/source/tests/test_cache.py::test_dir_cache PASSED [ 66%] 234s intake/source/tests/test_cache.py::test_compressed_cache PASSED [ 66%] 234s intake/source/tests/test_cache.py::test_filtered_compressed_cache PASSED [ 67%] 234s intake/source/tests/test_cache.py::test_cache_to_cat PASSED [ 67%] 234s intake/source/tests/test_cache.py::test_compressed_cache_infer PASSED [ 67%] 234s intake/source/tests/test_cache.py::test_compressions[tgz] PASSED [ 67%] 234s intake/source/tests/test_cache.py::test_compressions[tbz] PASSED [ 68%] 234s intake/source/tests/test_cache.py::test_compressions[tar] PASSED [ 68%] 234s intake/source/tests/test_cache.py::test_compressions[gz] PASSED [ 68%] 234s intake/source/tests/test_cache.py::test_compressions[bz] PASSED [ 68%] 234s intake/source/tests/test_cache.py::test_compressed_cache_bad PASSED [ 69%] 234s intake/source/tests/test_cache.py::test_dat SKIPPED (DAT not avaiable) [ 69%] 234s intake/source/tests/test_csv.py::test_csv_plugin PASSED [ 69%] 234s intake/source/tests/test_csv.py::test_open PASSED [ 69%] 234s intake/source/tests/test_csv.py::test_discover PASSED [ 70%] 234s intake/source/tests/test_csv.py::test_read PASSED [ 70%] 234s intake/source/tests/test_csv.py::test_read_list PASSED [ 70%] 234s intake/source/tests/test_csv.py::test_read_chunked PASSED [ 70%] 234s intake/source/tests/test_csv.py::test_read_pattern PASSED [ 70%] 234s intake/source/tests/test_csv.py::test_read_pattern_with_cache PASSED [ 71%] 235s intake/source/tests/test_csv.py::test_read_pattern_with_path_as_pattern_str PASSED [ 71%] 235s intake/source/tests/test_csv.py::test_read_partition PASSED [ 71%] 235s intake/source/tests/test_csv.py::test_to_dask PASSED [ 71%] 235s intake/source/tests/test_csv.py::test_plot SKIPPED (could not import...) 
[ 72%] 235s intake/source/tests/test_csv.py::test_close PASSED [ 72%] 235s intake/source/tests/test_csv.py::test_pickle PASSED [ 72%] 235s intake/source/tests/test_derived.py::test_columns PASSED [ 72%] 235s intake/source/tests/test_derived.py::test_df_transform PASSED [ 73%] 235s intake/source/tests/test_derived.py::test_barebones PASSED [ 73%] 235s intake/source/tests/test_derived.py::test_other_cat FAILED [ 73%] 235s intake/source/tests/test_discovery.py::test_package_scan PASSED [ 73%] 235s intake/source/tests/test_discovery.py::test_discover_cli PASSED [ 74%] 235s intake/source/tests/test_discovery.py::test_discover PASSED [ 74%] 235s intake/source/tests/test_discovery.py::test_enable_and_disable PASSED [ 74%] 235s intake/source/tests/test_discovery.py::test_discover_collision PASSED [ 74%] 235s intake/source/tests/test_json.py::test_jsonfile[None] PASSED [ 75%] 235s intake/source/tests/test_json.py::test_jsonfile[gzip] PASSED [ 75%] 235s intake/source/tests/test_json.py::test_jsonfile[bz2] PASSED [ 75%] 235s intake/source/tests/test_json.py::test_jsonfile_none[None] PASSED [ 75%] 235s intake/source/tests/test_json.py::test_jsonfile_none[gzip] PASSED [ 75%] 235s intake/source/tests/test_json.py::test_jsonfile_none[bz2] PASSED [ 76%] 235s intake/source/tests/test_json.py::test_jsonfile_discover[None] PASSED [ 76%] 235s intake/source/tests/test_json.py::test_jsonfile_discover[gzip] PASSED [ 76%] 235s intake/source/tests/test_json.py::test_jsonfile_discover[bz2] PASSED [ 76%] 235s intake/source/tests/test_json.py::test_jsonlfile[None] PASSED [ 77%] 235s intake/source/tests/test_json.py::test_jsonlfile[gzip] PASSED [ 77%] 235s intake/source/tests/test_json.py::test_jsonlfile[bz2] PASSED [ 77%] 235s intake/source/tests/test_json.py::test_jsonfilel_none[None] PASSED [ 77%] 235s intake/source/tests/test_json.py::test_jsonfilel_none[gzip] PASSED [ 78%] 235s intake/source/tests/test_json.py::test_jsonfilel_none[bz2] PASSED [ 78%] 235s intake/source/tests/test_json.py::test_jsonfilel_discover[None] PASSED [ 78%] 235s intake/source/tests/test_json.py::test_jsonfilel_discover[gzip] PASSED [ 78%] 235s intake/source/tests/test_json.py::test_jsonfilel_discover[bz2] PASSED [ 79%] 235s intake/source/tests/test_json.py::test_jsonl_head[None] PASSED [ 79%] 235s intake/source/tests/test_json.py::test_jsonl_head[gzip] PASSED [ 79%] 235s intake/source/tests/test_json.py::test_jsonl_head[bz2] PASSED [ 79%] 235s intake/source/tests/test_npy.py::test_one_file[shape0] PASSED [ 79%] 235s intake/source/tests/test_npy.py::test_one_file[shape1] PASSED [ 80%] 235s intake/source/tests/test_npy.py::test_one_file[shape2] PASSED [ 80%] 235s intake/source/tests/test_npy.py::test_one_file[shape3] PASSED [ 80%] 235s intake/source/tests/test_npy.py::test_one_file[shape4] PASSED [ 80%] 235s intake/source/tests/test_npy.py::test_multi_file[shape0] PASSED [ 81%] 235s intake/source/tests/test_npy.py::test_multi_file[shape1] PASSED [ 81%] 235s intake/source/tests/test_npy.py::test_multi_file[shape2] PASSED [ 81%] 235s intake/source/tests/test_npy.py::test_multi_file[shape3] PASSED [ 81%] 235s intake/source/tests/test_npy.py::test_multi_file[shape4] PASSED [ 82%] 235s intake/source/tests/test_npy.py::test_zarr_minimal SKIPPED (could no...) 
[ 82%] 236s intake/source/tests/test_text.py::test_textfiles PASSED [ 82%] 236s intake/source/tests/test_text.py::test_complex_text[None] PASSED [ 82%] 236s intake/source/tests/test_text.py::test_complex_text[gzip] PASSED [ 83%] 236s intake/source/tests/test_text.py::test_complex_text[bz2] PASSED [ 83%] 237s intake/source/tests/test_text.py::test_complex_bytes[pars0-None] PASSED [ 83%] 237s intake/source/tests/test_text.py::test_complex_bytes[pars0-gzip] PASSED [ 83%] 237s intake/source/tests/test_text.py::test_complex_bytes[pars0-bz2] PASSED [ 83%] 238s intake/source/tests/test_text.py::test_complex_bytes[pars1-None] PASSED [ 84%] 238s intake/source/tests/test_text.py::test_complex_bytes[pars1-gzip] PASSED [ 84%] 238s intake/source/tests/test_text.py::test_complex_bytes[pars1-bz2] PASSED [ 84%] 239s intake/source/tests/test_text.py::test_complex_bytes[pars2-None] PASSED [ 84%] 239s intake/source/tests/test_text.py::test_complex_bytes[pars2-gzip] PASSED [ 85%] 239s intake/source/tests/test_text.py::test_complex_bytes[pars2-bz2] PASSED [ 85%] 240s intake/source/tests/test_text.py::test_complex_bytes[pars3-None] PASSED [ 85%] 240s intake/source/tests/test_text.py::test_complex_bytes[pars3-gzip] PASSED [ 85%] 240s intake/source/tests/test_text.py::test_complex_bytes[pars3-bz2] PASSED [ 86%] 240s intake/source/tests/test_text.py::test_text_persist FAILED [ 86%] 240s intake/source/tests/test_text.py::test_text_export FAILED [ 86%] 240s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_{start_date:%Y%m%d}_{end_date:%Y%m%d}_01_T1_sr_band{band:1d}.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 86%] 240s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 87%] 240s intake/source/tests/test_utils.py::test_path_to_glob[{year}/{month}/{day}.csv-*/*/*.csv] PASSED [ 87%] 240s intake/source/tests/test_utils.py::test_path_to_glob[data/**/*.csv-data/**/*.csv] PASSED [ 87%] 240s intake/source/tests/test_utils.py::test_path_to_glob[data/{year:4}{month:02}{day:02}.csv-data/*.csv] PASSED [ 87%] 240s intake/source/tests/test_utils.py::test_path_to_glob[{lone_param}-*] PASSED [ 87%] 240s intake/source/tests/test_utils.py::test_reverse_format[*.csv-apple.csv-expected0] PASSED [ 88%] 240s intake/source/tests/test_utils.py::test_reverse_format[{}.csv-apple.csv-expected1] PASSED [ 88%] 240s intake/source/tests/test_utils.py::test_reverse_format[{fruit}.{}-apple.csv-expected2] PASSED [ 88%] 240s intake/source/tests/test_utils.py::test_reverse_format[data//{fruit}.csv-data/apple.csv-expected3] PASSED [ 88%] 240s intake/source/tests/test_utils.py::test_reverse_format[data\\{fruit}.csv-C:\\data\\apple.csv-expected4] PASSED [ 89%] 240s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-C:\\data\\apple.csv-expected5] PASSED [ 89%] 240s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-data//apple.csv-expected6] PASSED [ 89%] 240s intake/source/tests/test_utils.py::test_reverse_format[{num:d}.csv-k.csv-expected7] PASSED [ 89%] 240s intake/source/tests/test_utils.py::test_reverse_format[{year:d}/{month:d}/{day:d}.csv-2016/2/01.csv-expected8] PASSED [ 90%] 240s intake/source/tests/test_utils.py::test_reverse_format[{year:.4}/{month:.2}/{day:.2}.csv-2016/2/01.csv-expected9] PASSED [ 90%] 240s 
intake/source/tests/test_utils.py::test_reverse_format[SRLCCTabularDat/Ecoregions_{emissions}_Precip_{model}.csv-/user/examples/SRLCCTabularDat/Ecoregions_a1b_Precip_ECHAM5-MPI.csv-expected10] PASSED [ 90%] 240s intake/source/tests/test_utils.py::test_reverse_format[data_{date:%Y_%m_%d}.csv-data_2016_10_01.csv-expected11] PASSED [ 90%] 240s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5}-PA19104-expected12] PASSED [ 91%] 240s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5d}.csv-PA19104.csv-expected13] PASSED [ 91%] 240s intake/source/tests/test_utils.py::test_reverse_format[{state:2}{zip:d}.csv-PA19104.csv-expected14] PASSED [ 91%] 240s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{date:%Y%m%d}-expected0] PASSED [ 91%] 240s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{num: .2f}-expected1] PASSED [ 91%] 240s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{percentage:.2%}-expected2] PASSED [ 92%] 240s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[data/{year:4d}{month:02d}{day:02d}.csv-expected3] PASSED [ 92%] 240s intake/source/tests/test_utils.py::test_reverse_format_errors PASSED [ 92%] 240s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year}_{month}_{day}.csv] PASSED [ 92%] 240s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year:d}_{month:02d}_{day:02d}.csv] PASSED [ 93%] 240s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{date:%Y_%m_%d}.csv] PASSED [ 93%] 240s intake/source/tests/test_utils.py::test_path_to_pattern[http://data/band{band:1d}.tif-metadata0-/band{band:1d}.tif] PASSED [ 93%] 240s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-metadata1-/data/band{band:1d}.tif] PASSED [ 93%] 240s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-None-/data/band{band:1d}.tif] PASSED [ 94%] 240s intake/tests/test_config.py::test_load_conf[conf0] PASSED [ 94%] 240s intake/tests/test_config.py::test_load_conf[conf1] PASSED [ 94%] 240s intake/tests/test_config.py::test_load_conf[conf2] PASSED [ 94%] 242s intake/tests/test_config.py::test_basic PASSED [ 95%] 242s intake/tests/test_config.py::test_cli PASSED [ 95%] 243s intake/tests/test_config.py::test_persist_modes PASSED [ 95%] 243s intake/tests/test_config.py::test_conf PASSED [ 95%] 244s intake/tests/test_config.py::test_conf_auth PASSED [ 95%] 244s intake/tests/test_config.py::test_pathdirs PASSED [ 96%] 244s intake/tests/test_top_level.py::test_autoregister_open PASSED [ 96%] 244s intake/tests/test_top_level.py::test_default_catalogs PASSED [ 96%] 244s intake/tests/test_top_level.py::test_user_catalog PASSED [ 96%] 244s intake/tests/test_top_level.py::test_open_styles PASSED [ 97%] 246s intake/tests/test_top_level.py::test_path_catalog PASSED [ 97%] 246s intake/tests/test_top_level.py::test_bad_open PASSED [ 97%] 246s intake/tests/test_top_level.py::test_output_notebook SKIPPED (could ...) 
[ 97%] 246s intake/tests/test_top_level.py::test_old_usage PASSED [ 98%] 246s intake/tests/test_top_level.py::test_no_imports PASSED [ 98%] 246s intake/tests/test_top_level.py::test_nested_catalog_access PASSED [ 98%] 246s intake/tests/test_utils.py::test_windows_file_path PASSED [ 98%] 246s intake/tests/test_utils.py::test_make_path_posix_removes_double_sep PASSED [ 99%] 246s intake/tests/test_utils.py::test_noops[~/fake.file] PASSED [ 99%] 246s intake/tests/test_utils.py::test_noops[https://example.com] PASSED [ 99%] 246s intake/tests/test_utils.py::test_roundtrip_file_path PASSED [ 99%] 246s intake/tests/test_utils.py::test_yaml_tuples PASSED [100%] 246s 246s ==================================== ERRORS ==================================== 246s ____________ ERROR at setup of test_no_panel_does_not_raise_errors _____________ 246s 246s attr = 'pytest_plugins' 246s 246s def __getattr__(attr): 246s if attr == 'instance': 246s do_import() 246s > return gl['instance'] 246s E KeyError: 'instance' 246s 246s intake/interface/__init__.py:39: KeyError 246s _______________ ERROR at setup of test_no_panel_display_init_gui _______________ 246s 246s attr = 'pytest_plugins' 246s 246s def __getattr__(attr): 246s if attr == 'instance': 246s do_import() 246s > return gl['instance'] 246s E KeyError: 'instance' 246s 246s intake/interface/__init__.py:39: KeyError 246s ___________________ ERROR at setup of test_display_init_gui ____________________ 246s 246s attr = 'pytest_plugins' 246s 246s def __getattr__(attr): 246s if attr == 'instance': 246s do_import() 246s > return gl['instance'] 246s E KeyError: 'instance' 246s 246s intake/interface/__init__.py:39: KeyError 246s =================================== FAILURES =================================== 246s ______________________________ test_load_textfile ______________________________ 246s 246s catalog_cache = 246s 246s def test_load_textfile(catalog_cache): 246s cat = catalog_cache['text_cache'] 246s cache = cat.cache[0] 246s 246s cache_paths = cache.load(cat._urlpath, output=False) 246s > cache_path = cache_paths[-1] 246s E TypeError: 'NoneType' object is not subscriptable 246s 246s intake/catalog/tests/test_caching_integration.py:53: TypeError 246s _________________________________ test_nested __________________________________ 246s 246s args = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv',) 246s kwargs = {'storage_options': None} 246s func = .read at 0xe6912c08> 246s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files') 246s 246s @wraps(fn) 246s def wrapper(*args, **kwargs): 246s func = getattr(self, dispatch_name) 246s try: 246s > return func(*args, **kwargs) 246s 246s /usr/lib/python3/dist-packages/dask/backends.py:140: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 246s return read_pandas( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s reader = 246s urlpath = '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv' 246s blocksize = 'default', lineterminator = '\n', compression = 'infer' 246s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 246s storage_options = None, include_path_column = False, kwargs = {} 246s reader_name = 'read_csv', b_lineterminator = b'\n', 
kw = 'chunksize' 246s lastskiprow = 0, firstrow = 0 246s 246s def read_pandas( 246s reader, 246s urlpath, 246s blocksize="default", 246s lineterminator=None, 246s compression="infer", 246s sample=256000, 246s sample_rows=10, 246s enforce=False, 246s assume_missing=False, 246s storage_options=None, 246s include_path_column=False, 246s **kwargs, 246s ): 246s reader_name = reader.__name__ 246s if lineterminator is not None and len(lineterminator) == 1: 246s kwargs["lineterminator"] = lineterminator 246s else: 246s lineterminator = "\n" 246s if "encoding" in kwargs: 246s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 246s empty_blob = "".encode(kwargs["encoding"]) 246s if empty_blob: 246s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 246s # start of the line terminator, since this value is not a full file. 246s b_lineterminator = b_lineterminator[len(empty_blob) :] 246s else: 246s b_lineterminator = lineterminator.encode() 246s if include_path_column and isinstance(include_path_column, bool): 246s include_path_column = "path" 246s if "index" in kwargs or ( 246s "index_col" in kwargs and kwargs.get("index_col") is not False 246s ): 246s raise ValueError( 246s "Keywords 'index' and 'index_col' not supported, except for " 246s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 246s ) 246s for kw in ["iterator", "chunksize"]: 246s if kw in kwargs: 246s raise ValueError(f"{kw} not supported for dd.{reader_name}") 246s if kwargs.get("nrows", None): 246s raise ValueError( 246s "The 'nrows' keyword is not supported by " 246s "`dd.{0}`. To achieve the same behavior, it's " 246s "recommended to use `dd.{0}(...)." 246s "head(n=nrows)`".format(reader_name) 246s ) 246s if isinstance(kwargs.get("skiprows"), int): 246s lastskiprow = firstrow = kwargs.get("skiprows") 246s elif kwargs.get("skiprows") is None: 246s lastskiprow = firstrow = 0 246s else: 246s # When skiprows is a list, we expect more than max(skiprows) to 246s # be included in the sample. This means that [0,2] will work well, 246s # but [0, 440] might not work. 246s skiprows = set(kwargs.get("skiprows")) 246s lastskiprow = max(skiprows) 246s # find the firstrow that is not skipped, for use as header 246s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 246s if isinstance(kwargs.get("header"), list): 246s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 246s if isinstance(kwargs.get("converters"), dict) and include_path_column: 246s path_converter = kwargs.get("converters").get(include_path_column, None) 246s else: 246s path_converter = None 246s 246s # If compression is "infer", inspect the (first) path suffix and 246s # set the proper compression option if the suffix is recognized. 
246s if compression == "infer": 246s # Translate the input urlpath to a simple path list 246s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 246s 2 246s ] 246s 246s # Check for at least one valid path 246s if len(paths) == 0: 246s > raise OSError(f"{urlpath} resolved to no files") 246s E OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 246s 246s The above exception was the direct cause of the following exception: 246s 246s catalog1 = 246s 246s def test_nested(catalog1): 246s assert 'nested' in catalog1 246s assert 'entry1' in catalog1.nested.nested() 246s > assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read()) 246s 246s intake/catalog/tests/test_local.py:86: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/source/csv.py:129: in read 246s self._get_schema() 246s intake/source/csv.py:115: in _get_schema 246s self._open_dataset(urlpath) 246s intake/source/csv.py:94: in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s args = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv',) 246s kwargs = {'storage_options': None} 246s func = .read at 0xe6912c08> 246s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files') 246s 246s @wraps(fn) 246s def wrapper(*args, **kwargs): 246s func = getattr(self, dispatch_name) 246s try: 246s return func(*args, **kwargs) 246s except Exception as e: 246s try: 246s exc = type(e)( 246s f"An error occurred while calling the {funcname(func)} " 246s f"method registered to the {self.backend} backend.\n" 246s f"Original Message: {e}" 246s ) 246s except TypeError: 246s raise e 246s else: 246s > raise exc from e 246s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s E Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 246s ______________________________ test_info_describe ______________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_info_describe(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1', 246s 'entry1_part', 'remote_env', 246s 'local_env', 'text', 'arr', 'datetime']) 246s 246s > info = catalog['entry1'].describe() 246s 246s intake/catalog/tests/test_remote_integration.py:29: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ---------------------------- Captured stderr setup ----------------------------- 246s 2025-11-17 17:02:32,797 - intake - INFO - __main__.py:main:L53 - Creating catalog from: 246s 2025-11-17 17:02:32,797 - intake - INFO - __main__.py:main:L55 - - /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog1.yml 246s 2025-11-17 17:02:33,163 - intake - INFO - __main__.py:main:L62 - catalog_args: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog1.yml 246s 2025-11-17 17:02:33,163 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483 246s ----------------------------- Captured stderr call ----------------------------- 246s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 246s Dask dataframe query planning is disabled because dask-expr is not installed. 246s 246s You can install it with `pip install dask[dataframe]` or `conda install dask`. 246s This will raise in a future version. 246s 246s warnings.warn(msg, FutureWarning) 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 180.01ms 246s ______________________________ test_remote_direct ______________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_remote_direct(intake_server): 246s from intake.container.dataframe import RemoteDataFrame 246s catalog = open_catalog(intake_server) 246s > s0 = catalog.entry1() 246s 246s intake/catalog/tests/test_remote_integration.py:74: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 8.16ms 246s _______________________ test_remote_datasource_interface _______________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_remote_datasource_interface(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog['entry1'] 246s 246s intake/catalog/tests/test_remote_integration.py:101: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 6.97ms 246s __________________________________ test_read ___________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_read(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog['entry1'] 246s 246s intake/catalog/tests/test_remote_integration.py:116: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 7.20ms 246s _______________________________ test_read_chunks _______________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_read_chunks(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog.entry1 246s 246s intake/catalog/tests/test_remote_integration.py:170: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 7.19ms 246s _____________________________ test_read_partition ______________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_read_partition(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog.entry1 246s 246s intake/catalog/tests/test_remote_integration.py:186: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 7.47ms 246s __________________________________ test_close __________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_close(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog.entry1 246s 246s intake/catalog/tests/test_remote_integration.py:201: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 7.74ms 246s __________________________________ test_with ___________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_with(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > with catalog.entry1 as f: 246s 246s intake/catalog/tests/test_remote_integration.py:208: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 9.19ms 246s _________________________________ test_pickle __________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_pickle(intake_server): 246s catalog = open_catalog(intake_server) 246s 246s > d = catalog.entry1 246s 246s intake/catalog/tests/test_remote_integration.py:215: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 7.30ms 246s _________________________________ test_to_dask _________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_to_dask(intake_server): 246s catalog = open_catalog(intake_server) 246s > d = catalog.entry1 246s 246s intake/catalog/tests/test_remote_integration.py:231: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/catalog/base.py:391: in __getattr__ 246s return self[item] # triggers reload_on_change 246s intake/catalog/base.py:436: in __getitem__ 246s s = self._get_entry(key) 246s intake/catalog/utils.py:45: in wrapper 246s return f(self, *args, **kwargs) 246s intake/catalog/base.py:323: in _get_entry 246s return entry() 246s intake/catalog/entry.py:77: in __call__ 246s s = self.get(**kwargs) 246s intake/catalog/remote.py:459: in get 246s return open_remote( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 246s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 246s page_size = None, persist_mode = 'default' 246s auth = , getenv = True 246s getshell = True 246s 246s def open_remote(url, entry, container, user_parameters, description, http_args, 246s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 246s """Create either local direct data source or remote streamed source""" 246s from intake.container import container_map 246s import msgpack 246s import requests 246s from requests.compat import urljoin 246s 246s if url.startswith('intake://'): 246s url = url[len('intake://'):] 246s payload = dict(action='open', 246s name=entry, 246s parameters=user_parameters, 246s available_plugins=list(plugin_registry)) 246s req = requests.post(urljoin(url, 'v1/source'), 246s data=msgpack.packb(payload, **pack_kwargs), 246s **http_args) 246s if req.ok: 246s response = msgpack.unpackb(req.content, **unpack_kwargs) 246s 246s if 'plugin' in response: 246s pl = response['plugin'] 246s pl = [pl] if isinstance(pl, str) else pl 246s # Direct access 246s for p in pl: 246s if p in plugin_registry: 246s source = plugin_registry[p](**response['args']) 246s proxy = False 246s break 246s else: 246s proxy = True 246s else: 246s proxy = True 246s if proxy: 246s response.pop('container') 246s response.update({'name': entry, 'parameters': user_parameters}) 246s if container == 'catalog': 246s response.update({'auth': auth, 246s 'getenv': getenv, 246s 'getshell': getshell, 246s 'page_size': page_size, 246s 'persist_mode': persist_mode 246s # TODO ttl? 246s # TODO storage_options? 246s }) 246s source = container_map[container](url, http_args, **response) 246s source.description = description 246s return source 246s else: 246s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 246s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
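Every "Server error: 400" above traces back to the same server-side problem: the catalog's entry1_*.csv glob matches no files in the unpacked test tree, so dask.dataframe.read_csv() fails inside source.discover(). A minimal reproduction with dask alone (the path below is arbitrary; any glob that matches no files behaves the same way):

import dask.dataframe as dd

try:
    dd.read_csv("/tmp/no-such-dir/entry1_*.csv")   # hypothetical non-matching glob
except OSError as err:
    # dask's backend dispatcher re-wraps the original
    # "<urlpath> resolved to no files" OSError, as seen in the captured stderr.
    print(type(err).__name__, err)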
Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s intake/catalog/remote.py:519: Exception 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests//entry1_*.csv resolved to no files 246s 400 POST /v1/source (127.0.0.1): Discover failed 246s 400 POST /v1/source (127.0.0.1) 8.02ms 246s _____________________________ test_remote_sequence _____________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_remote_sequence(intake_server): 246s import glob 246s d = os.path.dirname(TEST_CATALOG_PATH) 246s catalog = open_catalog(intake_server) 246s assert 'text' in catalog 246s s = catalog.text() 246s s.discover() 246s > assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml'))) 246s E AssertionError: assert 0 == 29 246s E + where 0 = sources:\n text:\n args:\n dtype: null\n extra_metadata:\n catalog_dir: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/\n headers:\n headers: {}\n name: text\n npartitions: 0\n parameters: {}\n shape:\n - null\n source_id: bfbae428-d237-46ad-985c-5d4ab1515e74\n url: http://localhost:7483/\n description: textfiles in this dir\n driver: intake.container.semistructured.RemoteSequenceSource\n metadata:\n catalog_dir: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/\n.npartitions 246s E + and 29 = len(['/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog1.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_alias.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_caching.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_dup_parameters.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_dup_sources.yml', ...]) 246s E + where ['/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog1.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_alias.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_caching.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_dup_parameters.yml', '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/catalog_dup_sources.yml', ...] = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/*.yml') 246s E + where = .glob 246s E + and '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests/*.yml' = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/catalog/tests', '*.yml') 246s E + where = .join 246s E + where = os.path 246s 246s intake/catalog/tests/test_remote_integration.py:263: AssertionError 246s ___________________________________ test_dir ___________________________________ 246s 246s intake_server = 'intake://localhost:7483' 246s 246s def test_dir(intake_server): 246s PAGE_SIZE = 2 246s catalog = open_catalog(intake_server, page_size=PAGE_SIZE) 246s assert len(catalog._entries._page_cache) == 0 246s assert len(catalog._entries._direct_lookup_cache) == 0 246s assert not catalog._entries.complete 246s 246s with pytest.warns(UserWarning, match="Tab-complete"): 246s key_completions = catalog._ipython_key_completions_() 246s with pytest.warns(UserWarning, match="Tab-complete"): 246s dir_ = dir(catalog) 246s # __dir__ triggers loading the first page. 
246s assert len(catalog._entries._page_cache) == 2 246s assert len(catalog._entries._direct_lookup_cache) == 0 246s assert not catalog._entries.complete 246s assert set(key_completions) == set(['use_example1', 'nested']) 246s assert 'metadata' in dir_ # a normal attribute 246s assert 'use_example1' in dir_ # an entry from the first page 246s assert 'arr' not in dir_ # an entry we haven't cached yet 246s 246s # Trigger fetching one specific name. 246s catalog['arr'] 246s with pytest.warns(UserWarning, match="Tab-complete"): 246s dir_ = dir(catalog) 246s with pytest.warns(UserWarning, match="Tab-complete"): 246s key_completions = catalog._ipython_key_completions_() 246s assert 'metadata' in dir_ 246s assert 'arr' in dir_ # an entry cached via direct access 246s assert 'arr' in key_completions 246s 246s # Load everything. 246s list(catalog) 246s assert catalog._entries.complete 246s > with pytest.warns(None) as record: 246s 246s intake/catalog/tests/test_remote_integration.py:338: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s self = WarningsChecker(record=True), expected_warning = None, match_expr = None 246s 246s def __init__( 246s self, 246s expected_warning: type[Warning] | tuple[type[Warning], ...] = Warning, 246s match_expr: str | Pattern[str] | None = None, 246s *, 246s _ispytest: bool = False, 246s ) -> None: 246s check_ispytest(_ispytest) 246s super().__init__(_ispytest=True) 246s 246s msg = "exceptions must be derived from Warning, not %s" 246s if isinstance(expected_warning, tuple): 246s for exc in expected_warning: 246s if not issubclass(exc, Warning): 246s raise TypeError(msg % type(exc)) 246s expected_warning_tup = expected_warning 246s elif isinstance(expected_warning, type) and issubclass( 246s expected_warning, Warning 246s ): 246s expected_warning_tup = (expected_warning,) 246s else: 246s > raise TypeError(msg % type(expected_warning)) 246s E TypeError: exceptions must be derived from Warning, not 246s 246s /usr/lib/python3/dist-packages/_pytest/recwarn.py:279: TypeError 246s ________________________________ test_discover _________________________________ 246s 246s def test_discover(): 246s cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML, 246s 'entry1'] 246s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 246s universal_newlines=True) 246s out, _ = process.communicate() 246s 246s > assert "'dtype':" in out 246s E assert "'dtype':" in '' 246s 246s intake/cli/client/tests/test_local_integration.py:89: AssertionError 246s ----------------------------- Captured stderr call ----------------------------- 246s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 246s Dask dataframe query planning is disabled because dask-expr is not installed. 246s 246s You can install it with `pip install dask[dataframe]` or `conda install dask`. 246s This will raise in a future version. 
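The test_dir failure is different in kind: pytest no longer accepts pytest.warns(None) (deprecated in pytest 7, removed in pytest 8), which is why the WarningsChecker constructor above raises TypeError. A sketch of one common replacement, assuming the test's intent is "this block must emit no warnings":

import warnings

def assert_emits_no_warnings(fn, *args, **kwargs):
    # Record every warning raised while fn runs, then require the list be empty.
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        result = fn(*args, **kwargs)
    assert not record, [str(w.message) for w in record]
    return result

Under that assumption, the test's `with pytest.warns(None) as record:` block would become a call such as assert_emits_no_warnings(dir, catalog).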
246s 246s warnings.warn(msg, FutureWarning) 246s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 246s ________________________________ test_get_pass _________________________________ 246s 246s def test_get_pass(): 246s cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1'] 246s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 246s universal_newlines=True) 246s out, _ = process.communicate() 246s 246s > assert 'Charlie1 25.0 3' in out 246s E AssertionError: assert 'Charlie1 25.0 3' in '' 246s 246s intake/cli/client/tests/test_local_integration.py:101: AssertionError 246s ----------------------------- Captured stderr call ----------------------------- 246s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 246s Dask dataframe query planning is disabled because dask-expr is not installed. 246s 246s You can install it with `pip install dask[dataframe]` or `conda install dask`. 246s This will raise in a future version. 246s 246s warnings.warn(msg, FutureWarning) 246s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 246s ______________________ TestServerV1Source.test_idle_timer ______________________ 246s 246s self = 246s 246s def test_idle_timer(self): 246s self.server.start_periodic_functions(close_idle_after=0.1, 246s remove_idle_after=0.2) 246s 246s msg = dict(action='open', name='entry1', parameters={}) 246s > resp_msg, = self.make_post_request(msg) 246s 246s intake/cli/server/tests/test_server.py:208: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/cli/server/tests/test_server.py:96: in make_post_request 246s self.assertEqual(response.code, expected_status) 246s E AssertionError: 400 != 200 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 
246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s ------------------------------ Captured log call ------------------------------- 246s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 246s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 6.72ms 246s ______________________ TestServerV1Source.test_no_format _______________________ 246s 246s self = 246s 246s def test_no_format(self): 246s msg = dict(action='open', name='entry1', parameters={}) 246s > resp_msg, = self.make_post_request(msg) 246s 246s intake/cli/server/tests/test_server.py:195: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/cli/server/tests/test_server.py:96: in make_post_request 246s self.assertEqual(response.code, expected_status) 246s E AssertionError: 400 != 200 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 
246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s ------------------------------ Captured log call ------------------------------- 246s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 246s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 5.80ms 246s _________________________ TestServerV1Source.test_open _________________________ 246s 246s self = 246s 246s def test_open(self): 246s msg = dict(action='open', name='entry1', parameters={}) 246s > resp_msg, = self.make_post_request(msg) 246s 246s intake/cli/server/tests/test_server.py:112: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/cli/server/tests/test_server.py:96: in make_post_request 246s self.assertEqual(response.code, expected_status) 246s E AssertionError: 400 != 200 246s ----------------------------- Captured stderr call ----------------------------- 246s Traceback (most recent call last): 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 246s return func(*args, **kwargs) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 246s return read_pandas( 246s reader, 246s ...<10 lines>... 
246s **kwargs, 246s ) 246s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 246s raise OSError(f"{urlpath} resolved to no files") 246s OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s 246s The above exception was the direct cause of the following exception: 246s 246s Traceback (most recent call last): 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/server.py", line 306, in post 246s source.discover() 246s ~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 347, in discover 246s self._load_metadata() 246s ~~~~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/base.py", line 285, in _load_metadata 246s self._schema = self._get_schema() 246s ~~~~~~~~~~~~~~~~^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 115, in _get_schema 246s self._open_dataset(urlpath) 246s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 246s File "/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/csv.py", line 94, in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s ~~~~~~~~~~~~~~~~~~~~~~~^ 246s urlpath, storage_options=self._storage_options, 246s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 246s **self._csv_kwargs) 246s ^^^^^^^^^^^^^^^^^^^ 246s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 246s raise exc from e 246s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 246s Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/cli/server/tests//entry1_*.csv resolved to no files 246s ------------------------------ Captured log call ------------------------------- 246s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 246s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 5.57ms 246s ________________________________ test_other_cat ________________________________ 246s 246s args = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 246s kwargs = {'storage_options': None} 246s func = .read at 0xe6912c08> 246s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 246s 246s @wraps(fn) 246s def wrapper(*args, **kwargs): 246s func = getattr(self, dispatch_name) 246s try: 246s > return func(*args, **kwargs) 246s 246s /usr/lib/python3/dist-packages/dask/backends.py:140: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 246s return read_pandas( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s reader = 246s urlpath = '/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv' 246s blocksize = 'default', lineterminator = '\n', compression = 'infer' 246s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 246s storage_options = None, include_path_column = False, kwargs = {} 246s reader_name = 'read_csv', b_lineterminator = b'\n', kw = 'chunksize' 246s lastskiprow = 0, firstrow = 0 246s 246s def read_pandas( 246s reader, 246s urlpath, 246s blocksize="default", 246s lineterminator=None, 246s compression="infer", 246s sample=256000, 246s 
sample_rows=10, 246s enforce=False, 246s assume_missing=False, 246s storage_options=None, 246s include_path_column=False, 246s **kwargs, 246s ): 246s reader_name = reader.__name__ 246s if lineterminator is not None and len(lineterminator) == 1: 246s kwargs["lineterminator"] = lineterminator 246s else: 246s lineterminator = "\n" 246s if "encoding" in kwargs: 246s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 246s empty_blob = "".encode(kwargs["encoding"]) 246s if empty_blob: 246s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 246s # start of the line terminator, since this value is not a full file. 246s b_lineterminator = b_lineterminator[len(empty_blob) :] 246s else: 246s b_lineterminator = lineterminator.encode() 246s if include_path_column and isinstance(include_path_column, bool): 246s include_path_column = "path" 246s if "index" in kwargs or ( 246s "index_col" in kwargs and kwargs.get("index_col") is not False 246s ): 246s raise ValueError( 246s "Keywords 'index' and 'index_col' not supported, except for " 246s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 246s ) 246s for kw in ["iterator", "chunksize"]: 246s if kw in kwargs: 246s raise ValueError(f"{kw} not supported for dd.{reader_name}") 246s if kwargs.get("nrows", None): 246s raise ValueError( 246s "The 'nrows' keyword is not supported by " 246s "`dd.{0}`. To achieve the same behavior, it's " 246s "recommended to use `dd.{0}(...)." 246s "head(n=nrows)`".format(reader_name) 246s ) 246s if isinstance(kwargs.get("skiprows"), int): 246s lastskiprow = firstrow = kwargs.get("skiprows") 246s elif kwargs.get("skiprows") is None: 246s lastskiprow = firstrow = 0 246s else: 246s # When skiprows is a list, we expect more than max(skiprows) to 246s # be included in the sample. This means that [0,2] will work well, 246s # but [0, 440] might not work. 246s skiprows = set(kwargs.get("skiprows")) 246s lastskiprow = max(skiprows) 246s # find the firstrow that is not skipped, for use as header 246s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 246s if isinstance(kwargs.get("header"), list): 246s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 246s if isinstance(kwargs.get("converters"), dict) and include_path_column: 246s path_converter = kwargs.get("converters").get(include_path_column, None) 246s else: 246s path_converter = None 246s 246s # If compression is "infer", inspect the (first) path suffix and 246s # set the proper compression option if the suffix is recognized. 
246s if compression == "infer": 246s # Translate the input urlpath to a simple path list 246s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 246s 2 246s ] 246s 246s # Check for at least one valid path 246s if len(paths) == 0: 246s > raise OSError(f"{urlpath} resolved to no files") 246s E OSError: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 246s 246s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 246s 246s The above exception was the direct cause of the following exception: 246s 246s def test_other_cat(): 246s cat = intake.open_catalog(catfile) 246s > df1 = cat.other_cat.read() 246s 246s intake/source/tests/test_derived.py:35: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/source/derived.py:252: in read 246s return self.to_dask().compute() 246s intake/source/derived.py:239: in to_dask 246s self._df = self._transform(self._source.to_dask(), 246s intake/source/csv.py:133: in to_dask 246s self._get_schema() 246s intake/source/csv.py:115: in _get_schema 246s self._open_dataset(urlpath) 246s intake/source/csv.py:94: in _open_dataset 246s self._dataframe = dask.dataframe.read_csv( 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s args = ('/tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 246s kwargs = {'storage_options': None} 246s func = .read at 0xe6912c08> 246s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 246s 246s @wraps(fn) 246s def wrapper(*args, **kwargs): 246s func = getattr(self, dispatch_name) 246s try: 246s return func(*args, **kwargs) 246s except Exception as e: 246s try: 246s exc = type(e)( 246s f"An error occurred while calling the {funcname(func)} " 246s f"method registered to the {self.backend} backend.\n" 246s f"Original Message: {e}" 246s ) 246s except TypeError: 246s raise e 246s else: 246s > raise exc from e 246s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
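As the read_pandas() source above shows, dask resolves the glob with fsspec's get_fs_token_paths() and raises as soon as the resulting path list is empty. The same helper can be used to validate a urlpath up front and fail with a clearer message; resolve_or_raise below is a hypothetical helper sketched for illustration, not an intake or dask API:

from fsspec.core import get_fs_token_paths

def resolve_or_raise(urlpath, storage_options=None):
    # get_fs_token_paths returns (filesystem, token, paths); an empty paths
    # list is exactly what makes dask raise "resolved to no files".
    _, _, paths = get_fs_token_paths(
        urlpath, mode="rb", storage_options=storage_options
    )
    if not paths:
        raise FileNotFoundError(f"{urlpath} matched no files")
    return paths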
246s E Original Message: /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 246s 246s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 246s ______________________________ test_text_persist _______________________________ 246s 246s temp_cache = None 246s 246s def test_text_persist(temp_cache): 246s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 246s s = cat.sometext() 246s > s2 = s.persist() 246s 246s intake/source/tests/test_text.py:88: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/source/base.py:226: in persist 246s out = self._export(store.getdir(self), **kwargs) 246s intake/source/base.py:460: in _export 246s out = method(self, path=path, **kwargs) 246s intake/container/semistructured.py:70: in _persist 246s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 246s intake/container/semistructured.py:90: in _data_to_source 246s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 246s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 246s fs, fs_token, paths = get_fs_token_paths( 246s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 246s paths = _expand_paths(paths, name_function, num) 246s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 246s name_function = build_name_function(num - 1) 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s max_int = -0.99999999 246s 246s def build_name_function(max_int: float) -> Callable[[int], str]: 246s """Returns a function that receives a single integer 246s and returns it as a string padded by enough zero characters 246s to align with maximum possible integer 246s 246s >>> name_f = build_name_function(57) 246s 246s >>> name_f(7) 246s '07' 246s >>> name_f(31) 246s '31' 246s >>> build_name_function(1000)(42) 246s '0042' 246s >>> build_name_function(999)(42) 246s '042' 246s >>> build_name_function(0)(0) 246s '0' 246s """ 246s # handle corner cases max_int is 0 or exact power of 10 246s max_int += 1e-8 246s 246s > pad_length = int(math.ceil(math.log10(max_int))) 246s E ValueError: math domain error 246s 246s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError 246s _______________________________ test_text_export _______________________________ 246s 246s temp_cache = None 246s 246s def test_text_export(temp_cache): 246s import tempfile 246s outdir = tempfile.mkdtemp() 246s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 246s s = cat.sometext() 246s > out = s.export(outdir) 246s 246s intake/source/tests/test_text.py:97: 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s intake/source/base.py:452: in export 246s return self._export(path, **kwargs) 246s intake/source/base.py:460: in _export 246s out = method(self, path=path, **kwargs) 246s intake/container/semistructured.py:70: in _persist 246s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 246s intake/container/semistructured.py:90: in _data_to_source 246s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 246s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 246s fs, fs_token, paths = get_fs_token_paths( 246s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 246s paths = _expand_paths(paths, name_function, num) 246s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 246s name_function = 
build_name_function(num - 1) 246s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 246s 246s max_int = -0.99999999 246s 246s def build_name_function(max_int: float) -> Callable[[int], str]: 246s """Returns a function that receives a single integer 246s and returns it as a string padded by enough zero characters 246s to align with maximum possible integer 246s 246s >>> name_f = build_name_function(57) 246s 246s >>> name_f(7) 246s '07' 246s >>> name_f(31) 246s '31' 246s >>> build_name_function(1000)(42) 246s '0042' 246s >>> build_name_function(999)(42) 246s '042' 246s >>> build_name_function(0)(0) 246s '0' 246s """ 246s # handle corner cases max_int is 0 or exact power of 10 246s max_int += 1e-8 246s 246s > pad_length = int(math.ceil(math.log10(max_int))) 246s E ValueError: math domain error 246s 246s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError 246s =============================== warnings summary =============================== 246s intake/catalog/tests/test_alias.py::test_simple 246s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 246s Dask dataframe query planning is disabled because dask-expr is not installed. 246s 246s You can install it with `pip install dask[dataframe]` or `conda install dask`. 246s This will raise in a future version. 246s 246s warnings.warn(msg, FutureWarning) 246s 246s intake/source/tests/test_cache.py::test_filtered_compressed_cache 246s intake/source/tests/test_cache.py::test_compressions[tgz] 246s intake/source/tests/test_cache.py::test_compressions[tgz] 246s /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/decompress.py:27: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 246s tar.extractall(outpath) 246s 246s intake/source/tests/test_cache.py::test_compressions[tbz] 246s intake/source/tests/test_cache.py::test_compressions[tbz] 246s /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/decompress.py:37: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 246s tar.extractall(outpath) 246s 246s intake/source/tests/test_cache.py::test_compressions[tar] 246s intake/source/tests/test_cache.py::test_compressions[tar] 246s /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/decompress.py:47: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 246s tar.extractall(outpath) 246s 246s intake/source/tests/test_discovery.py::test_package_scan 246s intake/source/tests/test_discovery.py::test_package_scan 246s intake/source/tests/test_discovery.py::test_enable_and_disable 246s intake/source/tests/test_discovery.py::test_discover_collision 246s /tmp/autopkgtest.kHxhri/build.SqR/src/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed 246s warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning) 246s 246s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 246s =========================== short test summary info ============================ 246s FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile 246s FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc... 
246s FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface 246s FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition 246s FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except... 246s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence 246s FAILED intake/catalog/tests/test_remote_integration.py::test_dir - TypeError:... 246s FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass... 246s FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass... 246s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer 246s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format 246s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open 246s FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro... 246s FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math... 246s FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ... 246s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors 246s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui 246s ERROR intake/interface/tests/test_init_gui.py::test_display_init_gui - KeyErr... 246s ====== 22 failed, 379 passed, 31 skipped, 12 warnings, 3 errors in 49.77s ====== 246s autopkgtest [17:03:01]: test run-unit-test: -----------------------] 250s run-unit-test FAIL non-zero exit status 1 250s autopkgtest [17:03:05]: test run-unit-test: - - - - - - - - - - results - - - - - - - - - - 254s autopkgtest [17:03:09]: @@@@@@@@@@@@@@@@@@@@ summary 254s run-unit-test FAIL non-zero exit status 1
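The two "ValueError: math domain error" failures (test_text_persist, test_text_export) are a downstream symptom of the same empty-source situation: the text source yields zero partitions, fsspec's _expand_paths() then calls build_name_function(num - 1) with num == 0, and log10 of a negative number fails. The arithmetic, reduced to the values captured in the traceback:

import math

num = 0                  # number of output parts from the empty source
max_int = num - 1        # -1, the argument fsspec passes to build_name_function
max_int += 1e-8          # -0.99999999, matching the captured locals above
try:
    math.log10(max_int)  # log10 of a negative number
except ValueError as err:
    print(err)           # "math domain error"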
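Separately, the warnings summary flags intake/source/decompress.py calling tar.extractall() without a filter, which Python 3.12+ deprecates ahead of the 3.14 default change. A sketch of the call with the filter argument the warning asks for, assuming the cached test archives contain only regular data files:

import tarfile

def extract_archive(archive_path, outpath):
    # filter="data" rejects special members and sanitizes metadata (Python >= 3.12).
    with tarfile.open(archive_path) as tar:
        tar.extractall(outpath, filter="data")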