0s autopkgtest [19:00:46]: starting date and time: 2025-11-17 19:00:46+0000
0s autopkgtest [19:00:46]: git checkout: 4b346b80 nova: make wait_reboot return success even when a no-op
0s autopkgtest [19:00:46]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.70hnn8sb/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:intake --apt-upgrade intake --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=intake/0.6.6-4 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-ppc64el --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@bos03-ppc64el-2.secgroup --name adt-resolute-ppc64el-intake-20251117-190046-juju-7f2275-prod-proposed-migration-environment-20-3ee8f7e3-abf8-4f4e-aa8e-f215305083d7 --image adt/ubuntu-resolute-ppc64el-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-proposed-migration-ppc64el -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
3s Creating nova instance adt-resolute-ppc64el-intake-20251117-190046-juju-7f2275-prod-proposed-migration-environment-20-3ee8f7e3-abf8-4f4e-aa8e-f215305083d7 from image adt/ubuntu-resolute-ppc64el-server-20251117.img (UUID c6f5b741-c77a-45db-84cb-f00b40e77676)...
54s autopkgtest [19:01:40]: testbed dpkg architecture: ppc64el
54s autopkgtest [19:01:40]: testbed apt version: 3.1.11
54s autopkgtest [19:01:40]: @@@@@@@@@@@@@@@@@@@@ test bed setup
54s autopkgtest [19:01:40]: testbed release detected to be: None
55s autopkgtest [19:01:41]: updating testbed package index (apt update)
55s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease [87.8 kB]
56s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
56s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
56s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
56s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse Sources [22.9 kB]
56s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/universe Sources [778 kB]
56s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/restricted Sources [9848 B]
56s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/main Sources [72.6 kB]
56s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main ppc64el Packages [135 kB]
56s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/restricted ppc64el Packages [1276 B]
56s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/universe ppc64el Packages [499 kB]
56s Get:12 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse ppc64el Packages [11.0 kB]
56s Fetched 1618 kB in 1s (1285 kB/s)
57s Reading package lists...
58s Hit:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease
58s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
58s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
58s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
59s Reading package lists...
59s Reading package lists...
59s Building dependency tree...
59s Reading state information...
59s Calculating upgrade...
59s The following packages will be upgraded:
59s   apt libapt-pkg7.0 libcrypt-dev libcrypt1 usbutils
59s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
59s Need to get 3180 kB of archives.
59s After this operation, 118 kB of additional disk space will be used.
59s Get:1 http://ftpmaster.internal/ubuntu resolute/main ppc64el libcrypt-dev ppc64el 1:4.5.1-1 [162 kB]
60s Get:2 http://ftpmaster.internal/ubuntu resolute/main ppc64el libcrypt1 ppc64el 1:4.5.1-1 [125 kB]
60s Get:3 http://ftpmaster.internal/ubuntu resolute/main ppc64el libapt-pkg7.0 ppc64el 3.1.12 [1286 kB]
60s Get:4 http://ftpmaster.internal/ubuntu resolute/main ppc64el apt ppc64el 3.1.12 [1516 kB]
60s Get:5 http://ftpmaster.internal/ubuntu resolute/main ppc64el usbutils ppc64el 1:019-1 [91.5 kB]
61s dpkg-preconfigure: unable to re-open stdin: No such file or directory
61s Fetched 3180 kB in 1s (2957 kB/s)
61s (Reading database ... 81022 files and directories currently installed.)
61s Preparing to unpack .../libcrypt-dev_1%3a4.5.1-1_ppc64el.deb ...
61s Unpacking libcrypt-dev:ppc64el (1:4.5.1-1) over (1:4.4.38-1build1) ...
61s Preparing to unpack .../libcrypt1_1%3a4.5.1-1_ppc64el.deb ...
61s Unpacking libcrypt1:ppc64el (1:4.5.1-1) over (1:4.4.38-1build1) ...
61s Setting up libcrypt1:ppc64el (1:4.5.1-1) ...
61s (Reading database ... 81022 files and directories currently installed.)
61s Preparing to unpack .../libapt-pkg7.0_3.1.12_ppc64el.deb ...
61s Unpacking libapt-pkg7.0:ppc64el (3.1.12) over (3.1.11) ...
62s Preparing to unpack .../apt_3.1.12_ppc64el.deb ...
62s Unpacking apt (3.1.12) over (3.1.11) ...
62s Preparing to unpack .../usbutils_1%3a019-1_ppc64el.deb ...
62s Unpacking usbutils (1:019-1) over (1:018-2) ...
62s Setting up usbutils (1:019-1) ...
62s Setting up libcrypt-dev:ppc64el (1:4.5.1-1) ...
62s Setting up libapt-pkg7.0:ppc64el (3.1.12) ...
62s Setting up apt (3.1.12) ...
63s Processing triggers for man-db (2.13.1-1) ...
65s Processing triggers for libc-bin (2.42-2ubuntu2) ...
65s autopkgtest [19:01:51]: upgrading testbed (apt dist-upgrade and autopurge)
66s Reading package lists...
66s Building dependency tree...
66s Reading state information...
66s Calculating upgrade...
66s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
66s Reading package lists...
66s Building dependency tree...
67s Reading state information...
67s Solving dependencies...
67s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
67s autopkgtest [19:01:53]: rebooting testbed after setup commands that affected boot
94s autopkgtest [19:02:20]: testbed running kernel: Linux 6.17.0-5-generic #5-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 22 10:02:41 UTC 2025
96s autopkgtest [19:02:22]: @@@@@@@@@@@@@@@@@@@@ apt-source intake
100s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (dsc) [2693 B]
100s Get:2 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (tar) [4447 kB]
100s Get:3 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (diff) [15.8 kB]
100s gpgv: Signature made Wed Aug 27 08:46:02 2025 UTC
100s gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A
100s gpgv: issuer "tchet@debian.org"
100s gpgv: Can't check signature: No public key
100s dpkg-source: warning: cannot verify inline signature for ./intake_0.6.6-4.dsc: no acceptable signature found
100s autopkgtest [19:02:26]: testing package intake version 0.6.6-4
101s autopkgtest [19:02:27]: build not needed
102s autopkgtest [19:02:28]: test run-unit-test: preparing testbed
102s Reading package lists...
102s Building dependency tree...
102s Reading state information...
102s Solving dependencies...
102s The following NEW packages will be installed:
102s   fonts-font-awesome fonts-glyphicons-halflings fonts-lato libblas3
102s   libgfortran5 libjs-bootstrap libjs-jquery libjs-sphinxdoc libjs-underscore
102s   liblapack3 node-html5shiv python3-aiohappyeyeballs python3-aiohttp
102s   python3-aiosignal python3-all python3-async-timeout python3-click
102s   python3-cloudpickle python3-dask python3-entrypoints python3-frozenlist
102s   python3-fsspec python3-iniconfig python3-intake python3-intake-doc
102s   python3-locket python3-msgpack python3-msgpack-numpy python3-multidict
102s   python3-numpy python3-numpy-dev python3-pandas python3-pandas-lib
102s   python3-partd python3-platformdirs python3-pluggy python3-propcache
102s   python3-pytest python3-pytz python3-toolz python3-tornado python3-yarl
102s   sphinx-rtd-theme-common
102s 0 upgraded, 43 newly installed, 0 to remove and 0 not upgraded.
102s Need to get 29.2 MB of archives.
102s After this operation, 161 MB of additional disk space will be used.
102s Get:1 http://ftpmaster.internal/ubuntu resolute/main ppc64el fonts-lato all 2.015-1 [2781 kB] 103s Get:2 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-numpy-dev ppc64el 1:2.2.4+ds-1ubuntu1 [153 kB] 103s Get:3 http://ftpmaster.internal/ubuntu resolute/main ppc64el libblas3 ppc64el 3.12.1-7 [291 kB] 103s Get:4 http://ftpmaster.internal/ubuntu resolute/main ppc64el libgfortran5 ppc64el 15.2.0-7ubuntu1 [620 kB] 103s Get:5 http://ftpmaster.internal/ubuntu resolute/main ppc64el liblapack3 ppc64el 3.12.1-7 [2960 kB] 104s Get:6 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-numpy ppc64el 1:2.2.4+ds-1ubuntu1 [4887 kB] 105s Get:7 http://ftpmaster.internal/ubuntu resolute/main ppc64el fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 105s Get:8 http://ftpmaster.internal/ubuntu resolute/universe ppc64el fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-6 [119 kB] 105s Get:9 http://ftpmaster.internal/ubuntu resolute/universe ppc64el libjs-bootstrap all 3.4.1+dfsg-6 [129 kB] 105s Get:10 http://ftpmaster.internal/ubuntu resolute/main ppc64el libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 105s Get:11 http://ftpmaster.internal/ubuntu resolute/main ppc64el libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 105s Get:12 http://ftpmaster.internal/ubuntu resolute/main ppc64el libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB] 105s Get:13 http://ftpmaster.internal/ubuntu resolute/universe ppc64el node-html5shiv all 3.7.3+dfsg-5 [13.5 kB] 105s Get:14 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-aiohappyeyeballs all 2.6.1-2 [11.1 kB] 105s Get:15 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-multidict ppc64el 6.4.3-1build1 [74.0 kB] 105s Get:16 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-propcache ppc64el 0.3.1-1build1 [57.0 kB] 105s Get:17 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-yarl ppc64el 1.22.0-1 [106 kB] 105s Get:18 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-async-timeout all 5.0.1-1 [6830 B] 105s Get:19 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-frozenlist ppc64el 1.8.0-1 [60.5 kB] 105s Get:20 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-aiosignal all 1.4.0-1 [5628 B] 105s Get:21 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-aiohttp ppc64el 3.13.1-1 [491 kB] 105s Get:22 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-all ppc64el 3.13.7-1 [884 B] 105s Get:23 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-click all 8.2.0+0.really.8.1.8-1 [80.0 kB] 105s Get:24 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-cloudpickle all 3.1.1-1 [22.4 kB] 105s Get:25 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-fsspec all 2025.3.2-1ubuntu1 [217 kB] 105s Get:26 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-toolz all 1.0.0-2 [45.0 kB] 105s Get:27 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-locket all 1.0.0-2 [5872 B] 105s Get:28 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-partd all 1.4.2-1 [15.7 kB] 105s Get:29 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-dask all 2024.12.1+dfsg-2 [875 kB] 105s Get:30 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-entrypoints all 0.4-3 [7174 B] 105s Get:31 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-iniconfig all 2.1.0-1 [6840 B] 105s Get:32 http://ftpmaster.internal/ubuntu 
resolute/main ppc64el python3-msgpack ppc64el 1.0.3-3build5 [114 kB] 105s Get:33 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-platformdirs all 4.3.7-1 [16.9 kB] 105s Get:34 http://ftpmaster.internal/ubuntu resolute-proposed/universe ppc64el python3-intake ppc64el 0.6.6-4 [197 kB] 105s Get:35 http://ftpmaster.internal/ubuntu resolute/main ppc64el sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB] 105s Get:36 http://ftpmaster.internal/ubuntu resolute-proposed/universe ppc64el python3-intake-doc all 0.6.6-4 [1549 kB] 106s Get:37 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-msgpack-numpy all 0.4.8-1 [7388 B] 106s Get:38 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-pytz all 2025.2-4 [32.3 kB] 106s Get:39 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-pandas-lib ppc64el 2.3.3+dfsg-1ubuntu1 [7666 kB] 107s Get:40 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-pandas all 2.3.3+dfsg-1ubuntu1 [2948 kB] 107s Get:41 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-pluggy all 1.6.0-1 [21.0 kB] 107s Get:42 http://ftpmaster.internal/ubuntu resolute/universe ppc64el python3-pytest all 8.3.5-2 [252 kB] 107s Get:43 http://ftpmaster.internal/ubuntu resolute/main ppc64el python3-tornado ppc64el 6.5.2-3 [305 kB] 107s Fetched 29.2 MB in 5s (5905 kB/s) 107s Selecting previously unselected package fonts-lato. 108s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 81022 files and directories currently installed.) 108s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 108s Unpacking fonts-lato (2.015-1) ... 108s Selecting previously unselected package python3-numpy-dev:ppc64el. 108s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_ppc64el.deb ... 108s Unpacking python3-numpy-dev:ppc64el (1:2.2.4+ds-1ubuntu1) ... 108s Selecting previously unselected package libblas3:ppc64el. 108s Preparing to unpack .../02-libblas3_3.12.1-7_ppc64el.deb ... 108s Unpacking libblas3:ppc64el (3.12.1-7) ... 108s Selecting previously unselected package libgfortran5:ppc64el. 108s Preparing to unpack .../03-libgfortran5_15.2.0-7ubuntu1_ppc64el.deb ... 108s Unpacking libgfortran5:ppc64el (15.2.0-7ubuntu1) ... 108s Selecting previously unselected package liblapack3:ppc64el. 108s Preparing to unpack .../04-liblapack3_3.12.1-7_ppc64el.deb ... 108s Unpacking liblapack3:ppc64el (3.12.1-7) ... 108s Selecting previously unselected package python3-numpy. 108s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_ppc64el.deb ... 108s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ... 108s Selecting previously unselected package fonts-font-awesome. 108s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 108s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 108s Selecting previously unselected package fonts-glyphicons-halflings. 108s Preparing to unpack .../07-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-6_all.deb ... 
108s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 108s Selecting previously unselected package libjs-bootstrap. 108s Preparing to unpack .../08-libjs-bootstrap_3.4.1+dfsg-6_all.deb ... 108s Unpacking libjs-bootstrap (3.4.1+dfsg-6) ... 108s Selecting previously unselected package libjs-jquery. 108s Preparing to unpack .../09-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 108s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 108s Selecting previously unselected package libjs-underscore. 108s Preparing to unpack .../10-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 108s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 108s Selecting previously unselected package libjs-sphinxdoc. 108s Preparing to unpack .../11-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ... 108s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ... 108s Selecting previously unselected package node-html5shiv. 108s Preparing to unpack .../12-node-html5shiv_3.7.3+dfsg-5_all.deb ... 108s Unpacking node-html5shiv (3.7.3+dfsg-5) ... 108s Selecting previously unselected package python3-aiohappyeyeballs. 108s Preparing to unpack .../13-python3-aiohappyeyeballs_2.6.1-2_all.deb ... 108s Unpacking python3-aiohappyeyeballs (2.6.1-2) ... 108s Selecting previously unselected package python3-multidict. 108s Preparing to unpack .../14-python3-multidict_6.4.3-1build1_ppc64el.deb ... 108s Unpacking python3-multidict (6.4.3-1build1) ... 108s Selecting previously unselected package python3-propcache. 108s Preparing to unpack .../15-python3-propcache_0.3.1-1build1_ppc64el.deb ... 108s Unpacking python3-propcache (0.3.1-1build1) ... 108s Selecting previously unselected package python3-yarl. 108s Preparing to unpack .../16-python3-yarl_1.22.0-1_ppc64el.deb ... 108s Unpacking python3-yarl (1.22.0-1) ... 108s Selecting previously unselected package python3-async-timeout. 108s Preparing to unpack .../17-python3-async-timeout_5.0.1-1_all.deb ... 108s Unpacking python3-async-timeout (5.0.1-1) ... 108s Selecting previously unselected package python3-frozenlist. 108s Preparing to unpack .../18-python3-frozenlist_1.8.0-1_ppc64el.deb ... 109s Unpacking python3-frozenlist (1.8.0-1) ... 109s Selecting previously unselected package python3-aiosignal. 109s Preparing to unpack .../19-python3-aiosignal_1.4.0-1_all.deb ... 109s Unpacking python3-aiosignal (1.4.0-1) ... 109s Selecting previously unselected package python3-aiohttp. 109s Preparing to unpack .../20-python3-aiohttp_3.13.1-1_ppc64el.deb ... 109s Unpacking python3-aiohttp (3.13.1-1) ... 109s Selecting previously unselected package python3-all. 109s Preparing to unpack .../21-python3-all_3.13.7-1_ppc64el.deb ... 109s Unpacking python3-all (3.13.7-1) ... 109s Selecting previously unselected package python3-click. 109s Preparing to unpack .../22-python3-click_8.2.0+0.really.8.1.8-1_all.deb ... 109s Unpacking python3-click (8.2.0+0.really.8.1.8-1) ... 109s Selecting previously unselected package python3-cloudpickle. 109s Preparing to unpack .../23-python3-cloudpickle_3.1.1-1_all.deb ... 109s Unpacking python3-cloudpickle (3.1.1-1) ... 109s Selecting previously unselected package python3-fsspec. 109s Preparing to unpack .../24-python3-fsspec_2025.3.2-1ubuntu1_all.deb ... 109s Unpacking python3-fsspec (2025.3.2-1ubuntu1) ... 109s Selecting previously unselected package python3-toolz. 109s Preparing to unpack .../25-python3-toolz_1.0.0-2_all.deb ... 109s Unpacking python3-toolz (1.0.0-2) ... 109s Selecting previously unselected package python3-locket. 
109s Preparing to unpack .../26-python3-locket_1.0.0-2_all.deb ... 109s Unpacking python3-locket (1.0.0-2) ... 109s Selecting previously unselected package python3-partd. 109s Preparing to unpack .../27-python3-partd_1.4.2-1_all.deb ... 109s Unpacking python3-partd (1.4.2-1) ... 109s Selecting previously unselected package python3-dask. 109s Preparing to unpack .../28-python3-dask_2024.12.1+dfsg-2_all.deb ... 109s Unpacking python3-dask (2024.12.1+dfsg-2) ... 109s Selecting previously unselected package python3-entrypoints. 109s Preparing to unpack .../29-python3-entrypoints_0.4-3_all.deb ... 109s Unpacking python3-entrypoints (0.4-3) ... 109s Selecting previously unselected package python3-iniconfig. 109s Preparing to unpack .../30-python3-iniconfig_2.1.0-1_all.deb ... 109s Unpacking python3-iniconfig (2.1.0-1) ... 109s Selecting previously unselected package python3-msgpack. 109s Preparing to unpack .../31-python3-msgpack_1.0.3-3build5_ppc64el.deb ... 109s Unpacking python3-msgpack (1.0.3-3build5) ... 109s Selecting previously unselected package python3-platformdirs. 109s Preparing to unpack .../32-python3-platformdirs_4.3.7-1_all.deb ... 109s Unpacking python3-platformdirs (4.3.7-1) ... 109s Selecting previously unselected package python3-intake. 109s Preparing to unpack .../33-python3-intake_0.6.6-4_ppc64el.deb ... 109s Unpacking python3-intake (0.6.6-4) ... 109s Selecting previously unselected package sphinx-rtd-theme-common. 109s Preparing to unpack .../34-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ... 109s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 109s Selecting previously unselected package python3-intake-doc. 109s Preparing to unpack .../35-python3-intake-doc_0.6.6-4_all.deb ... 109s Unpacking python3-intake-doc (0.6.6-4) ... 109s Selecting previously unselected package python3-msgpack-numpy. 109s Preparing to unpack .../36-python3-msgpack-numpy_0.4.8-1_all.deb ... 109s Unpacking python3-msgpack-numpy (0.4.8-1) ... 109s Selecting previously unselected package python3-pytz. 109s Preparing to unpack .../37-python3-pytz_2025.2-4_all.deb ... 109s Unpacking python3-pytz (2025.2-4) ... 109s Selecting previously unselected package python3-pandas-lib:ppc64el. 109s Preparing to unpack .../38-python3-pandas-lib_2.3.3+dfsg-1ubuntu1_ppc64el.deb ... 109s Unpacking python3-pandas-lib:ppc64el (2.3.3+dfsg-1ubuntu1) ... 109s Selecting previously unselected package python3-pandas. 109s Preparing to unpack .../39-python3-pandas_2.3.3+dfsg-1ubuntu1_all.deb ... 109s Unpacking python3-pandas (2.3.3+dfsg-1ubuntu1) ... 109s Selecting previously unselected package python3-pluggy. 109s Preparing to unpack .../40-python3-pluggy_1.6.0-1_all.deb ... 109s Unpacking python3-pluggy (1.6.0-1) ... 109s Selecting previously unselected package python3-pytest. 109s Preparing to unpack .../41-python3-pytest_8.3.5-2_all.deb ... 109s Unpacking python3-pytest (8.3.5-2) ... 109s Selecting previously unselected package python3-tornado. 109s Preparing to unpack .../42-python3-tornado_6.5.2-3_ppc64el.deb ... 109s Unpacking python3-tornado (6.5.2-3) ... 109s Setting up python3-entrypoints (0.4-3) ... 110s Setting up python3-iniconfig (2.1.0-1) ... 110s Setting up python3-tornado (6.5.2-3) ... 110s Setting up fonts-lato (2.015-1) ... 110s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 110s Setting up python3-fsspec (2025.3.2-1ubuntu1) ... 110s Setting up node-html5shiv (3.7.3+dfsg-5) ... 110s Setting up python3-all (3.13.7-1) ... 110s Setting up python3-pytz (2025.2-4) ... 
110s Setting up python3-click (8.2.0+0.really.8.1.8-1) ... 111s Setting up python3-platformdirs (4.3.7-1) ... 111s Setting up python3-multidict (6.4.3-1build1) ... 111s Setting up python3-cloudpickle (3.1.1-1) ... 111s Setting up python3-frozenlist (1.8.0-1) ... 111s Setting up python3-aiosignal (1.4.0-1) ... 111s Setting up python3-async-timeout (5.0.1-1) ... 111s Setting up libblas3:ppc64el (3.12.1-7) ... 111s update-alternatives: using /usr/lib/powerpc64le-linux-gnu/blas/libblas.so.3 to provide /usr/lib/powerpc64le-linux-gnu/libblas.so.3 (libblas.so.3-powerpc64le-linux-gnu) in auto mode 111s Setting up python3-numpy-dev:ppc64el (1:2.2.4+ds-1ubuntu1) ... 111s Setting up python3-aiohappyeyeballs (2.6.1-2) ... 111s Setting up libgfortran5:ppc64el (15.2.0-7ubuntu1) ... 111s Setting up python3-pluggy (1.6.0-1) ... 111s Setting up python3-propcache (0.3.1-1build1) ... 111s Setting up python3-toolz (1.0.0-2) ... 112s Setting up python3-msgpack (1.0.3-3build5) ... 112s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 112s Setting up python3-locket (1.0.0-2) ... 112s Setting up python3-yarl (1.22.0-1) ... 112s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 112s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 112s Setting up libjs-bootstrap (3.4.1+dfsg-6) ... 112s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 112s Setting up python3-partd (1.4.2-1) ... 112s Setting up liblapack3:ppc64el (3.12.1-7) ... 112s update-alternatives: using /usr/lib/powerpc64le-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/powerpc64le-linux-gnu/liblapack.so.3 (liblapack.so.3-powerpc64le-linux-gnu) in auto mode 112s Setting up python3-pytest (8.3.5-2) ... 112s Setting up python3-aiohttp (3.13.1-1) ... 113s Setting up python3-dask (2024.12.1+dfsg-2) ... 114s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ... 116s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ... 116s Setting up python3-intake (0.6.6-4) ... 116s Setting up python3-msgpack-numpy (0.4.8-1) ... 116s Setting up python3-pandas-lib:ppc64el (2.3.3+dfsg-1ubuntu1) ... 116s Setting up python3-intake-doc (0.6.6-4) ... 116s Setting up python3-pandas (2.3.3+dfsg-1ubuntu1) ... 120s Processing triggers for man-db (2.13.1-1) ... 121s Processing triggers for libc-bin (2.42-2ubuntu2) ... 122s autopkgtest [19:02:48]: test run-unit-test: [----------------------- 123s ============================= test session starts ============================== 123s platform linux -- Python 3.13.9, pytest-8.3.5, pluggy-1.6.0 -- /usr/bin/python3.13 123s cachedir: .pytest_cache 123s rootdir: /tmp/autopkgtest.sDpKou/build.wv0/src 123s plugins: typeguard-4.4.2 125s collecting ... 
collected 424 items / 11 skipped 125s 125s intake/auth/tests/test_auth.py::test_get PASSED [ 0%] 125s intake/auth/tests/test_auth.py::test_base PASSED [ 0%] 125s intake/auth/tests/test_auth.py::test_base_client PASSED [ 0%] 125s intake/auth/tests/test_auth.py::test_base_get_case_insensitive PASSED [ 0%] 125s intake/auth/tests/test_auth.py::test_secret PASSED [ 1%] 125s intake/auth/tests/test_auth.py::test_secret_client PASSED [ 1%] 125s intake/catalog/tests/test_alias.py::test_simple PASSED [ 1%] 125s intake/catalog/tests/test_alias.py::test_mapping PASSED [ 1%] 128s intake/catalog/tests/test_auth_integration.py::test_secret_auth PASSED [ 2%] 131s intake/catalog/tests/test_auth_integration.py::test_secret_auth_fail PASSED [ 2%] 131s intake/catalog/tests/test_caching_integration.py::test_load_csv PASSED [ 2%] 131s intake/catalog/tests/test_caching_integration.py::test_list_of_files PASSED [ 2%] 131s intake/catalog/tests/test_caching_integration.py::test_bad_type_cache PASSED [ 3%] 131s intake/catalog/tests/test_caching_integration.py::test_load_textfile FAILED [ 3%] 131s intake/catalog/tests/test_caching_integration.py::test_load_arr PASSED [ 3%] 131s intake/catalog/tests/test_caching_integration.py::test_regex[test_no_regex] PASSED [ 3%] 131s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_no_match] PASSED [ 4%] 131s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_partial_match] PASSED [ 4%] 131s intake/catalog/tests/test_caching_integration.py::test_get_metadata PASSED [ 4%] 131s intake/catalog/tests/test_caching_integration.py::test_clear_cache PASSED [ 4%] 131s intake/catalog/tests/test_caching_integration.py::test_clear_cache_bad_metadata PASSED [ 4%] 131s intake/catalog/tests/test_caching_integration.py::test_clear_all PASSED [ 5%] 131s intake/catalog/tests/test_caching_integration.py::test_second_load PASSED [ 5%] 132s intake/catalog/tests/test_caching_integration.py::test_second_load_timestamp PASSED [ 5%] 132s intake/catalog/tests/test_caching_integration.py::test_second_load_refresh PASSED [ 5%] 132s intake/catalog/tests/test_caching_integration.py::test_multiple_cache PASSED [ 6%] 132s intake/catalog/tests/test_caching_integration.py::test_disable_caching PASSED [ 6%] 132s intake/catalog/tests/test_caching_integration.py::test_ds_set_cache_dir PASSED [ 6%] 132s intake/catalog/tests/test_catalog_save.py::test_catalog_description PASSED [ 6%] 132s intake/catalog/tests/test_core.py::test_no_entry PASSED [ 7%] 132s intake/catalog/tests/test_core.py::test_regression PASSED [ 7%] 132s intake/catalog/tests/test_default.py::test_load PASSED [ 7%] 132s intake/catalog/tests/test_discovery.py::test_catalog_discovery PASSED [ 7%] 132s intake/catalog/tests/test_discovery.py::test_deferred_import PASSED [ 8%] 132s intake/catalog/tests/test_gui.py::test_cat_no_panel_does_not_raise_errors PASSED [ 8%] 132s intake/catalog/tests/test_gui.py::test_cat_no_panel_display_gui PASSED [ 8%] 132s intake/catalog/tests/test_gui.py::test_cat_gui SKIPPED (could not im...) [ 8%] 132s intake/catalog/tests/test_gui.py::test_entry_no_panel_does_not_raise_errors PASSED [ 8%] 132s intake/catalog/tests/test_gui.py::test_entry_no_panel_display_gui PASSED [ 9%] 132s intake/catalog/tests/test_gui.py::test_entry_gui SKIPPED (could not ...) 
[ 9%] 132s intake/catalog/tests/test_local.py::test_local_catalog PASSED [ 9%] 132s intake/catalog/tests/test_local.py::test_get_items PASSED [ 9%] 132s intake/catalog/tests/test_local.py::test_nested FAILED [ 10%] 132s intake/catalog/tests/test_local.py::test_nested_gets_name_from_super PASSED [ 10%] 132s intake/catalog/tests/test_local.py::test_hash PASSED [ 10%] 132s intake/catalog/tests/test_local.py::test_getitem PASSED [ 10%] 132s intake/catalog/tests/test_local.py::test_source_plugin_config PASSED [ 11%] 132s intake/catalog/tests/test_local.py::test_metadata PASSED [ 11%] 132s intake/catalog/tests/test_local.py::test_use_source_plugin_from_config PASSED [ 11%] 132s intake/catalog/tests/test_local.py::test_get_dir PASSED [ 11%] 132s intake/catalog/tests/test_local.py::test_entry_dir_function PASSED [ 12%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[bool-False] PASSED [ 12%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[datetime-expected1] PASSED [ 12%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[float-0.0] PASSED [ 12%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[int-0] PASSED [ 12%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[list-expected4] PASSED [ 13%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[str-] PASSED [ 13%] 132s intake/catalog/tests/test_local.py::test_user_parameter_default_value[unicode-] PASSED [ 13%] 132s intake/catalog/tests/test_local.py::test_user_parameter_repr PASSED [ 13%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-true-True] PASSED [ 14%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-0-False] PASSED [ 14%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-given2-expected2] PASSED [ 14%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-2018-01-01 12:34AM-expected3] PASSED [ 14%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-1234567890000000000-expected4] PASSED [ 15%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[float-3.14-3.14] PASSED [ 15%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[int-1-1] PASSED [ 15%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[list-given7-expected7] PASSED [ 15%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[str-1-1] PASSED [ 16%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[unicode-foo-foo] PASSED [ 16%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[now] PASSED [ 16%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[today] PASSED [ 16%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[float-100.0-100.0] PASSED [ 16%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20-20] PASSED [ 17%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20.0-20] PASSED [ 17%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[float-100.0-100.0] PASSED [ 17%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20-20] PASSED [ 17%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20.0-20] PASSED [ 18%] 132s 
intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[float-given0-expected0] PASSED [ 18%] 132s intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[int-given1-expected1] PASSED [ 18%] 132s intake/catalog/tests/test_local.py::test_user_parameter_validation_range PASSED [ 18%] 132s intake/catalog/tests/test_local.py::test_user_parameter_validation_allowed PASSED [ 19%] 132s intake/catalog/tests/test_local.py::test_user_pars_list PASSED [ 19%] 132s intake/catalog/tests/test_local.py::test_user_pars_mlist PASSED [ 19%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[catalog_non_dict] PASSED [ 19%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_missing] PASSED [ 20%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_name_non_string] PASSED [ 20%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_non_dict] PASSED [ 20%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_value_non_dict] PASSED [ 20%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_missing_required] PASSED [ 20%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_name_non_string] PASSED [ 21%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_non_dict] PASSED [ 21%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_choice] PASSED [ 21%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_type] PASSED [ 21%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_non_dict] PASSED [ 22%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_non_dict] PASSED [ 22%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing] PASSED [ 22%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing_key] PASSED [ 22%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_dict] PASSED [ 23%] 132s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_list] PASSED [ 23%] 132s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_data_source_list] PASSED [ 23%] 132s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_params_list] PASSED [ 23%] 132s intake/catalog/tests/test_local.py::test_union_catalog PASSED [ 24%] 132s intake/catalog/tests/test_local.py::test_persist_local_cat PASSED [ 24%] 132s intake/catalog/tests/test_local.py::test_empty_catalog PASSED [ 24%] 132s intake/catalog/tests/test_local.py::test_nonexistent_error PASSED [ 24%] 132s intake/catalog/tests/test_local.py::test_duplicate_data_sources PASSED [ 25%] 132s intake/catalog/tests/test_local.py::test_duplicate_parameters PASSED [ 25%] 133s intake/catalog/tests/test_local.py::test_catalog_file_removal PASSED [ 25%] 133s intake/catalog/tests/test_local.py::test_flatten_duplicate_error PASSED [ 25%] 133s intake/catalog/tests/test_local.py::test_multi_cat_names PASSED [ 25%] 133s intake/catalog/tests/test_local.py::test_name_of_builtin PASSED [ 26%] 133s intake/catalog/tests/test_local.py::test_cat_with_declared_name PASSED [ 26%] 133s intake/catalog/tests/test_local.py::test_cat_with_no_declared_name_gets_name_from_dir_if_file_named_catalog PASSED [ 26%] 133s intake/catalog/tests/test_local.py::test_default_expansions 
PASSED [ 26%] 133s intake/catalog/tests/test_local.py::test_remote_cat PASSED [ 27%] 133s intake/catalog/tests/test_local.py::test_multi_plugins PASSED [ 27%] 133s intake/catalog/tests/test_local.py::test_no_plugins PASSED [ 27%] 133s intake/catalog/tests/test_local.py::test_explicit_entry_driver PASSED [ 27%] 133s intake/catalog/tests/test_local.py::test_getitem_and_getattr PASSED [ 28%] 133s intake/catalog/tests/test_local.py::test_dot_names PASSED [ 28%] 133s intake/catalog/tests/test_local.py::test_listing PASSED [ 28%] 133s intake/catalog/tests/test_local.py::test_dict_save PASSED [ 28%] 133s intake/catalog/tests/test_local.py::test_dict_save_complex PASSED [ 29%] 133s intake/catalog/tests/test_local.py::test_dict_adddel PASSED [ 29%] 133s intake/catalog/tests/test_local.py::test_filter PASSED [ 29%] 133s intake/catalog/tests/test_local.py::test_from_dict_with_data_source PASSED [ 29%] 133s intake/catalog/tests/test_local.py::test_no_instance PASSED [ 29%] 133s intake/catalog/tests/test_local.py::test_fsspec_integration PASSED [ 30%] 133s intake/catalog/tests/test_local.py::test_cat_add PASSED [ 30%] 133s intake/catalog/tests/test_local.py::test_no_entries_items PASSED [ 30%] 133s intake/catalog/tests/test_local.py::test_cat_dictlike PASSED [ 30%] 133s intake/catalog/tests/test_local.py::test_inherit_params SKIPPED (tes...) [ 31%] 133s intake/catalog/tests/test_local.py::test_runtime_overwrite_params SKIPPED [ 31%] 133s intake/catalog/tests/test_local.py::test_local_param_overwrites SKIPPED [ 31%] 133s intake/catalog/tests/test_local.py::test_local_and_global_params SKIPPED [ 31%] 133s intake/catalog/tests/test_local.py::test_search_inherit_params SKIPPED [ 32%] 133s intake/catalog/tests/test_local.py::test_multiple_cats_params SKIPPED [ 32%] 133s intake/catalog/tests/test_parameters.py::test_simplest PASSED [ 32%] 133s intake/catalog/tests/test_parameters.py::test_cache_default_source PASSED [ 32%] 133s intake/catalog/tests/test_parameters.py::test_parameter_default PASSED [ 33%] 133s intake/catalog/tests/test_parameters.py::test_maybe_default_from_env PASSED [ 33%] 133s intake/catalog/tests/test_parameters.py::test_up_override_and_render PASSED [ 33%] 133s intake/catalog/tests/test_parameters.py::test_user_explicit_override PASSED [ 33%] 133s intake/catalog/tests/test_parameters.py::test_auto_env_expansion PASSED [ 33%] 133s intake/catalog/tests/test_parameters.py::test_validate_up PASSED [ 34%] 133s intake/catalog/tests/test_parameters.py::test_validate_par PASSED [ 34%] 133s intake/catalog/tests/test_parameters.py::test_mlist_parameter PASSED [ 34%] 133s intake/catalog/tests/test_parameters.py::test_explicit_overrides PASSED [ 34%] 133s intake/catalog/tests/test_parameters.py::test_extra_arg PASSED [ 35%] 133s intake/catalog/tests/test_parameters.py::test_unknown PASSED [ 35%] 133s intake/catalog/tests/test_parameters.py::test_catalog_passthrough PASSED [ 35%] 133s intake/catalog/tests/test_persist.py::test_idempotent SKIPPED (could...) [ 35%] 133s intake/catalog/tests/test_persist.py::test_parquet SKIPPED (could no...) 
[ 36%] 136s intake/catalog/tests/test_reload_integration.py::test_reload_updated_config PASSED [ 36%] 138s intake/catalog/tests/test_reload_integration.py::test_reload_updated_directory PASSED [ 36%] 140s intake/catalog/tests/test_reload_integration.py::test_reload_missing_remote_directory PASSED [ 36%] 143s intake/catalog/tests/test_reload_integration.py::test_reload_missing_local_directory PASSED [ 37%] 144s intake/catalog/tests/test_remote_integration.py::test_info_describe FAILED [ 37%] 144s intake/catalog/tests/test_remote_integration.py::test_bad_url PASSED [ 37%] 144s intake/catalog/tests/test_remote_integration.py::test_metadata PASSED [ 37%] 144s intake/catalog/tests/test_remote_integration.py::test_nested_remote PASSED [ 37%] 144s intake/catalog/tests/test_remote_integration.py::test_remote_direct FAILED [ 38%] 144s intake/catalog/tests/test_remote_integration.py::test_entry_metadata PASSED [ 38%] 144s intake/catalog/tests/test_remote_integration.py::test_unknown_source PASSED [ 38%] 144s intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface FAILED [ 38%] 144s intake/catalog/tests/test_remote_integration.py::test_environment_evaluation PASSED [ 39%] 144s intake/catalog/tests/test_remote_integration.py::test_read FAILED [ 39%] 144s intake/catalog/tests/test_remote_integration.py::test_read_direct PASSED [ 39%] 144s intake/catalog/tests/test_remote_integration.py::test_read_chunks FAILED [ 39%] 144s intake/catalog/tests/test_remote_integration.py::test_read_partition FAILED [ 40%] 144s intake/catalog/tests/test_remote_integration.py::test_close FAILED [ 40%] 144s intake/catalog/tests/test_remote_integration.py::test_with FAILED [ 40%] 144s intake/catalog/tests/test_remote_integration.py::test_pickle FAILED [ 40%] 144s intake/catalog/tests/test_remote_integration.py::test_to_dask FAILED [ 41%] 145s intake/catalog/tests/test_remote_integration.py::test_remote_env PASSED [ 41%] 145s intake/catalog/tests/test_remote_integration.py::test_remote_sequence FAILED [ 41%] 145s intake/catalog/tests/test_remote_integration.py::test_remote_arr PASSED [ 41%] 145s intake/catalog/tests/test_remote_integration.py::test_pagination PASSED [ 41%] 145s intake/catalog/tests/test_remote_integration.py::test_dir FAILED [ 42%] 145s intake/catalog/tests/test_remote_integration.py::test_getitem_and_getattr PASSED [ 42%] 145s intake/catalog/tests/test_remote_integration.py::test_search PASSED [ 42%] 145s intake/catalog/tests/test_remote_integration.py::test_access_subcatalog PASSED [ 42%] 145s intake/catalog/tests/test_remote_integration.py::test_len PASSED [ 43%] 146s intake/catalog/tests/test_remote_integration.py::test_datetime PASSED [ 43%] 146s intake/catalog/tests/test_utils.py::test_expand_templates PASSED [ 43%] 146s intake/catalog/tests/test_utils.py::test_expand_nested_template PASSED [ 43%] 146s intake/catalog/tests/test_utils.py::test_coerce_datetime[None-expected0] PASSED [ 44%] 146s intake/catalog/tests/test_utils.py::test_coerce_datetime[1-expected1] PASSED [ 44%] 146s intake/catalog/tests/test_utils.py::test_coerce_datetime[1988-02-24T13:37+0100-expected2] PASSED [ 44%] 146s intake/catalog/tests/test_utils.py::test_coerce_datetime[test_input3-expected3] PASSED [ 44%] 146s intake/catalog/tests/test_utils.py::test_flatten PASSED [ 45%] 146s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_0] PASSED [ 45%] 146s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_1] PASSED [ 45%] 146s intake/catalog/tests/test_utils.py::test_coerce[1-str-1] PASSED [ 45%] 146s 
intake/catalog/tests/test_utils.py::test_coerce[value3-list-expected3] PASSED [ 45%] 146s intake/catalog/tests/test_utils.py::test_coerce[value4-list-expected4] PASSED [ 46%] 146s intake/catalog/tests/test_utils.py::test_coerce[value5-list[str]-expected5] PASSED [ 46%] 146s intake/cli/client/tests/test_cache.py::test_help PASSED [ 46%] 147s intake/cli/client/tests/test_cache.py::test_list_keys PASSED [ 46%] 148s intake/cli/client/tests/test_cache.py::test_precache PASSED [ 47%] 148s intake/cli/client/tests/test_cache.py::test_clear_all PASSED [ 47%] 148s intake/cli/client/tests/test_cache.py::test_clear_one PASSED [ 47%] 148s intake/cli/client/tests/test_cache.py::test_usage PASSED [ 47%] 149s intake/cli/client/tests/test_conf.py::test_reset PASSED [ 48%] 149s intake/cli/client/tests/test_conf.py::test_info PASSED [ 48%] 149s intake/cli/client/tests/test_conf.py::test_defaults PASSED [ 48%] 149s intake/cli/client/tests/test_conf.py::test_get PASSED [ 48%] 150s intake/cli/client/tests/test_conf.py::test_log_level PASSED [ 49%] 150s intake/cli/client/tests/test_local_integration.py::test_list PASSED [ 49%] 150s intake/cli/client/tests/test_local_integration.py::test_full_list PASSED [ 49%] 151s intake/cli/client/tests/test_local_integration.py::test_describe PASSED [ 49%] 151s intake/cli/client/tests/test_local_integration.py::test_exists_pass PASSED [ 50%] 151s intake/cli/client/tests/test_local_integration.py::test_exists_fail PASSED [ 50%] 152s intake/cli/client/tests/test_local_integration.py::test_discover FAILED [ 50%] 153s intake/cli/client/tests/test_local_integration.py::test_get_pass FAILED [ 50%] 153s intake/cli/client/tests/test_local_integration.py::test_get_fail PASSED [ 50%] 153s intake/cli/client/tests/test_local_integration.py::test_example PASSED [ 51%] 153s intake/cli/server/tests/test_serializer.py::test_dataframe[ser0] SKIPPED [ 51%] 153s intake/cli/server/tests/test_serializer.py::test_dataframe[ser1] SKIPPED [ 51%] 153s intake/cli/server/tests/test_serializer.py::test_dataframe[ser2] SKIPPED [ 51%] 153s intake/cli/server/tests/test_serializer.py::test_ndarray[ser0] PASSED [ 52%] 153s intake/cli/server/tests/test_serializer.py::test_ndarray[ser1] PASSED [ 52%] 153s intake/cli/server/tests/test_serializer.py::test_ndarray[ser2] PASSED [ 52%] 153s intake/cli/server/tests/test_serializer.py::test_python[ser0] PASSED [ 52%] 153s intake/cli/server/tests/test_serializer.py::test_python[ser1] PASSED [ 53%] 153s intake/cli/server/tests/test_serializer.py::test_python[ser2] PASSED [ 53%] 153s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp0] PASSED [ 53%] 153s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp1] PASSED [ 53%] 153s intake/cli/server/tests/test_serializer.py::test_none_compress PASSED [ 54%] 153s intake/cli/server/tests/test_server.py::TestServerV1Info::test_info PASSED [ 54%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_bad_action PASSED [ 54%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer FAILED [ 54%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format FAILED [ 54%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open FAILED [ 55%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open_direct PASSED [ 55%] 153s intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_part_compressed SKIPPED [ 55%] 153s 
intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_partition SKIPPED [ 55%] 154s intake/cli/server/tests/test_server.py::test_flatten_flag PASSED [ 56%] 154s intake/cli/server/tests/test_server.py::test_port_flag PASSED [ 56%] 154s intake/cli/tests/test_util.py::test_print_entry_info PASSED [ 56%] 154s intake/cli/tests/test_util.py::test_die PASSED [ 56%] 154s intake/cli/tests/test_util.py::Test_nice_join::test_default PASSED [ 57%] 154s intake/cli/tests/test_util.py::Test_nice_join::test_string_conjunction PASSED [ 57%] 154s intake/cli/tests/test_util.py::Test_nice_join::test_None_conjunction PASSED [ 57%] 154s intake/cli/tests/test_util.py::Test_nice_join::test_sep PASSED [ 57%] 154s intake/cli/tests/test_util.py::TestSubcommand::test_initialize_abstract PASSED [ 58%] 154s intake/cli/tests/test_util.py::TestSubcommand::test_invoke_abstract PASSED [ 58%] 154s intake/container/tests/test_generics.py::test_generic_dataframe PASSED [ 58%] 155s intake/container/tests/test_persist.py::test_store PASSED [ 58%] 155s intake/container/tests/test_persist.py::test_backtrack PASSED [ 58%] 155s intake/container/tests/test_persist.py::test_persist_with_nonnumeric_ttl_raises_error PASSED [ 59%] 155s intake/container/tests/test_persist.py::test_undask_persist SKIPPED [ 59%] 155s intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors ERROR [ 59%] 155s intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui ERROR [ 59%] 155s intake/interface/tests/test_init_gui.py::test_display_init_gui ERROR [ 60%] 155s intake/source/tests/test_base.py::test_datasource_base_method_exceptions PASSED [ 60%] 155s intake/source/tests/test_base.py::test_name PASSED [ 60%] 155s intake/source/tests/test_base.py::test_datasource_base_context_manager PASSED [ 60%] 155s intake/source/tests/test_base.py::test_datasource_discover PASSED [ 61%] 155s intake/source/tests/test_base.py::test_datasource_read PASSED [ 61%] 155s intake/source/tests/test_base.py::test_datasource_read_chunked PASSED [ 61%] 155s intake/source/tests/test_base.py::test_datasource_read_partition PASSED [ 61%] 155s intake/source/tests/test_base.py::test_datasource_read_partition_out_of_range PASSED [ 62%] 155s intake/source/tests/test_base.py::test_datasource_to_dask PASSED [ 62%] 155s intake/source/tests/test_base.py::test_datasource_close PASSED [ 62%] 155s intake/source/tests/test_base.py::test_datasource_context_manager PASSED [ 62%] 155s intake/source/tests/test_base.py::test_datasource_pickle PASSED [ 62%] 155s intake/source/tests/test_base.py::test_datasource_python_discover PASSED [ 63%] 155s intake/source/tests/test_base.py::test_datasource_python_read PASSED [ 63%] 155s intake/source/tests/test_base.py::test_datasource_python_to_dask PASSED [ 63%] 155s intake/source/tests/test_base.py::test_yaml_method PASSED [ 63%] 155s intake/source/tests/test_base.py::test_alias_fail PASSED [ 64%] 155s intake/source/tests/test_base.py::test_reconfigure PASSED [ 64%] 155s intake/source/tests/test_base.py::test_import_name[data0] PASSED [ 64%] 155s intake/source/tests/test_base.py::test_import_name[data1] PASSED [ 64%] 155s intake/source/tests/test_base.py::test_import_name[data2] PASSED [ 65%] 155s intake/source/tests/test_base.py::test_import_name[data3] PASSED [ 65%] 155s intake/source/tests/test_base.py::test_import_name[data4] PASSED [ 65%] 155s intake/source/tests/test_cache.py::test_ensure_cache_dir PASSED [ 65%] 155s intake/source/tests/test_cache.py::test_munge_path PASSED [ 66%] 155s 
intake/source/tests/test_cache.py::test_hash PASSED [ 66%] 155s intake/source/tests/test_cache.py::test_path PASSED [ 66%] 155s intake/source/tests/test_cache.py::test_dir_cache PASSED [ 66%] 156s intake/source/tests/test_cache.py::test_compressed_cache PASSED [ 66%] 156s intake/source/tests/test_cache.py::test_filtered_compressed_cache PASSED [ 67%] 156s intake/source/tests/test_cache.py::test_cache_to_cat PASSED [ 67%] 156s intake/source/tests/test_cache.py::test_compressed_cache_infer PASSED [ 67%] 156s intake/source/tests/test_cache.py::test_compressions[tgz] PASSED [ 67%] 156s intake/source/tests/test_cache.py::test_compressions[tbz] PASSED [ 68%] 156s intake/source/tests/test_cache.py::test_compressions[tar] PASSED [ 68%] 156s intake/source/tests/test_cache.py::test_compressions[gz] PASSED [ 68%] 156s intake/source/tests/test_cache.py::test_compressions[bz] PASSED [ 68%] 156s intake/source/tests/test_cache.py::test_compressed_cache_bad PASSED [ 69%] 156s intake/source/tests/test_cache.py::test_dat SKIPPED (DAT not avaiable) [ 69%] 156s intake/source/tests/test_csv.py::test_csv_plugin PASSED [ 69%] 156s intake/source/tests/test_csv.py::test_open PASSED [ 69%] 156s intake/source/tests/test_csv.py::test_discover PASSED [ 70%] 156s intake/source/tests/test_csv.py::test_read PASSED [ 70%] 156s intake/source/tests/test_csv.py::test_read_list PASSED [ 70%] 156s intake/source/tests/test_csv.py::test_read_chunked PASSED [ 70%] 156s intake/source/tests/test_csv.py::test_read_pattern PASSED [ 70%] 156s intake/source/tests/test_csv.py::test_read_pattern_with_cache PASSED [ 71%] 156s intake/source/tests/test_csv.py::test_read_pattern_with_path_as_pattern_str PASSED [ 71%] 156s intake/source/tests/test_csv.py::test_read_partition PASSED [ 71%] 156s intake/source/tests/test_csv.py::test_to_dask PASSED [ 71%] 156s intake/source/tests/test_csv.py::test_plot SKIPPED (could not import...) 
[ 72%] 156s intake/source/tests/test_csv.py::test_close PASSED [ 72%] 156s intake/source/tests/test_csv.py::test_pickle PASSED [ 72%] 156s intake/source/tests/test_derived.py::test_columns PASSED [ 72%] 156s intake/source/tests/test_derived.py::test_df_transform PASSED [ 73%] 156s intake/source/tests/test_derived.py::test_barebones PASSED [ 73%] 156s intake/source/tests/test_derived.py::test_other_cat FAILED [ 73%] 156s intake/source/tests/test_discovery.py::test_package_scan PASSED [ 73%] 157s intake/source/tests/test_discovery.py::test_discover_cli PASSED [ 74%] 157s intake/source/tests/test_discovery.py::test_discover PASSED [ 74%] 157s intake/source/tests/test_discovery.py::test_enable_and_disable PASSED [ 74%] 157s intake/source/tests/test_discovery.py::test_discover_collision PASSED [ 74%] 157s intake/source/tests/test_json.py::test_jsonfile[None] PASSED [ 75%] 157s intake/source/tests/test_json.py::test_jsonfile[gzip] PASSED [ 75%] 157s intake/source/tests/test_json.py::test_jsonfile[bz2] PASSED [ 75%] 157s intake/source/tests/test_json.py::test_jsonfile_none[None] PASSED [ 75%] 157s intake/source/tests/test_json.py::test_jsonfile_none[gzip] PASSED [ 75%] 157s intake/source/tests/test_json.py::test_jsonfile_none[bz2] PASSED [ 76%] 157s intake/source/tests/test_json.py::test_jsonfile_discover[None] PASSED [ 76%] 157s intake/source/tests/test_json.py::test_jsonfile_discover[gzip] PASSED [ 76%] 157s intake/source/tests/test_json.py::test_jsonfile_discover[bz2] PASSED [ 76%] 157s intake/source/tests/test_json.py::test_jsonlfile[None] PASSED [ 77%] 157s intake/source/tests/test_json.py::test_jsonlfile[gzip] PASSED [ 77%] 157s intake/source/tests/test_json.py::test_jsonlfile[bz2] PASSED [ 77%] 157s intake/source/tests/test_json.py::test_jsonfilel_none[None] PASSED [ 77%] 157s intake/source/tests/test_json.py::test_jsonfilel_none[gzip] PASSED [ 78%] 157s intake/source/tests/test_json.py::test_jsonfilel_none[bz2] PASSED [ 78%] 157s intake/source/tests/test_json.py::test_jsonfilel_discover[None] PASSED [ 78%] 157s intake/source/tests/test_json.py::test_jsonfilel_discover[gzip] PASSED [ 78%] 157s intake/source/tests/test_json.py::test_jsonfilel_discover[bz2] PASSED [ 79%] 157s intake/source/tests/test_json.py::test_jsonl_head[None] PASSED [ 79%] 157s intake/source/tests/test_json.py::test_jsonl_head[gzip] PASSED [ 79%] 157s intake/source/tests/test_json.py::test_jsonl_head[bz2] PASSED [ 79%] 157s intake/source/tests/test_npy.py::test_one_file[shape0] PASSED [ 79%] 157s intake/source/tests/test_npy.py::test_one_file[shape1] PASSED [ 80%] 157s intake/source/tests/test_npy.py::test_one_file[shape2] PASSED [ 80%] 157s intake/source/tests/test_npy.py::test_one_file[shape3] PASSED [ 80%] 157s intake/source/tests/test_npy.py::test_one_file[shape4] PASSED [ 80%] 157s intake/source/tests/test_npy.py::test_multi_file[shape0] PASSED [ 81%] 157s intake/source/tests/test_npy.py::test_multi_file[shape1] PASSED [ 81%] 157s intake/source/tests/test_npy.py::test_multi_file[shape2] PASSED [ 81%] 157s intake/source/tests/test_npy.py::test_multi_file[shape3] PASSED [ 81%] 157s intake/source/tests/test_npy.py::test_multi_file[shape4] PASSED [ 82%] 157s intake/source/tests/test_npy.py::test_zarr_minimal SKIPPED (could no...) 
[ 82%] 157s intake/source/tests/test_text.py::test_textfiles PASSED [ 82%] 157s intake/source/tests/test_text.py::test_complex_text[None] PASSED [ 82%] 157s intake/source/tests/test_text.py::test_complex_text[gzip] PASSED [ 83%] 158s intake/source/tests/test_text.py::test_complex_text[bz2] PASSED [ 83%] 158s intake/source/tests/test_text.py::test_complex_bytes[pars0-None] PASSED [ 83%] 158s intake/source/tests/test_text.py::test_complex_bytes[pars0-gzip] PASSED [ 83%] 158s intake/source/tests/test_text.py::test_complex_bytes[pars0-bz2] PASSED [ 83%] 159s intake/source/tests/test_text.py::test_complex_bytes[pars1-None] PASSED [ 84%] 159s intake/source/tests/test_text.py::test_complex_bytes[pars1-gzip] PASSED [ 84%] 159s intake/source/tests/test_text.py::test_complex_bytes[pars1-bz2] PASSED [ 84%] 159s intake/source/tests/test_text.py::test_complex_bytes[pars2-None] PASSED [ 84%] 160s intake/source/tests/test_text.py::test_complex_bytes[pars2-gzip] PASSED [ 85%] 160s intake/source/tests/test_text.py::test_complex_bytes[pars2-bz2] PASSED [ 85%] 160s intake/source/tests/test_text.py::test_complex_bytes[pars3-None] PASSED [ 85%] 160s intake/source/tests/test_text.py::test_complex_bytes[pars3-gzip] PASSED [ 85%] 160s intake/source/tests/test_text.py::test_complex_bytes[pars3-bz2] PASSED [ 86%] 161s intake/source/tests/test_text.py::test_text_persist FAILED [ 86%] 161s intake/source/tests/test_text.py::test_text_export FAILED [ 86%] 161s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_{start_date:%Y%m%d}_{end_date:%Y%m%d}_01_T1_sr_band{band:1d}.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 86%] 161s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 87%] 161s intake/source/tests/test_utils.py::test_path_to_glob[{year}/{month}/{day}.csv-*/*/*.csv] PASSED [ 87%] 161s intake/source/tests/test_utils.py::test_path_to_glob[data/**/*.csv-data/**/*.csv] PASSED [ 87%] 161s intake/source/tests/test_utils.py::test_path_to_glob[data/{year:4}{month:02}{day:02}.csv-data/*.csv] PASSED [ 87%] 161s intake/source/tests/test_utils.py::test_path_to_glob[{lone_param}-*] PASSED [ 87%] 161s intake/source/tests/test_utils.py::test_reverse_format[*.csv-apple.csv-expected0] PASSED [ 88%] 161s intake/source/tests/test_utils.py::test_reverse_format[{}.csv-apple.csv-expected1] PASSED [ 88%] 161s intake/source/tests/test_utils.py::test_reverse_format[{fruit}.{}-apple.csv-expected2] PASSED [ 88%] 161s intake/source/tests/test_utils.py::test_reverse_format[data//{fruit}.csv-data/apple.csv-expected3] PASSED [ 88%] 161s intake/source/tests/test_utils.py::test_reverse_format[data\\{fruit}.csv-C:\\data\\apple.csv-expected4] PASSED [ 89%] 161s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-C:\\data\\apple.csv-expected5] PASSED [ 89%] 161s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-data//apple.csv-expected6] PASSED [ 89%] 161s intake/source/tests/test_utils.py::test_reverse_format[{num:d}.csv-k.csv-expected7] PASSED [ 89%] 161s intake/source/tests/test_utils.py::test_reverse_format[{year:d}/{month:d}/{day:d}.csv-2016/2/01.csv-expected8] PASSED [ 90%] 161s intake/source/tests/test_utils.py::test_reverse_format[{year:.4}/{month:.2}/{day:.2}.csv-2016/2/01.csv-expected9] PASSED [ 90%] 161s 
intake/source/tests/test_utils.py::test_reverse_format[SRLCCTabularDat/Ecoregions_{emissions}_Precip_{model}.csv-/user/examples/SRLCCTabularDat/Ecoregions_a1b_Precip_ECHAM5-MPI.csv-expected10] PASSED [ 90%] 161s intake/source/tests/test_utils.py::test_reverse_format[data_{date:%Y_%m_%d}.csv-data_2016_10_01.csv-expected11] PASSED [ 90%] 161s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5}-PA19104-expected12] PASSED [ 91%] 161s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5d}.csv-PA19104.csv-expected13] PASSED [ 91%] 161s intake/source/tests/test_utils.py::test_reverse_format[{state:2}{zip:d}.csv-PA19104.csv-expected14] PASSED [ 91%] 161s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{date:%Y%m%d}-expected0] PASSED [ 91%] 161s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{num: .2f}-expected1] PASSED [ 91%] 161s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{percentage:.2%}-expected2] PASSED [ 92%] 161s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[data/{year:4d}{month:02d}{day:02d}.csv-expected3] PASSED [ 92%] 161s intake/source/tests/test_utils.py::test_reverse_format_errors PASSED [ 92%] 161s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year}_{month}_{day}.csv] PASSED [ 92%] 161s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year:d}_{month:02d}_{day:02d}.csv] PASSED [ 93%] 161s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{date:%Y_%m_%d}.csv] PASSED [ 93%] 161s intake/source/tests/test_utils.py::test_path_to_pattern[http://data/band{band:1d}.tif-metadata0-/band{band:1d}.tif] PASSED [ 93%] 161s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-metadata1-/data/band{band:1d}.tif] PASSED [ 93%] 161s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-None-/data/band{band:1d}.tif] PASSED [ 94%] 161s intake/tests/test_config.py::test_load_conf[conf0] PASSED [ 94%] 161s intake/tests/test_config.py::test_load_conf[conf1] PASSED [ 94%] 161s intake/tests/test_config.py::test_load_conf[conf2] PASSED [ 94%] 162s intake/tests/test_config.py::test_basic PASSED [ 95%] 162s intake/tests/test_config.py::test_cli PASSED [ 95%] 163s intake/tests/test_config.py::test_persist_modes PASSED [ 95%] 163s intake/tests/test_config.py::test_conf PASSED [ 95%] 164s intake/tests/test_config.py::test_conf_auth PASSED [ 95%] 164s intake/tests/test_config.py::test_pathdirs PASSED [ 96%] 164s intake/tests/test_top_level.py::test_autoregister_open PASSED [ 96%] 164s intake/tests/test_top_level.py::test_default_catalogs PASSED [ 96%] 164s intake/tests/test_top_level.py::test_user_catalog PASSED [ 96%] 164s intake/tests/test_top_level.py::test_open_styles PASSED [ 97%] 166s intake/tests/test_top_level.py::test_path_catalog PASSED [ 97%] 166s intake/tests/test_top_level.py::test_bad_open PASSED [ 97%] 166s intake/tests/test_top_level.py::test_output_notebook SKIPPED (could ...) 
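Editor's note: the parameterized IDs above double as a small spec for the pattern-to-glob helper exercised by test_path_to_glob, since each ID spells out "input-pattern-expected-glob". A minimal sketch of that behaviour, assuming the helper is the path_to_glob function in intake/source/utils.py (the module the test file sits next to); the input/expected pairs are taken directly from the test IDs logged above:

    from intake.source.utils import path_to_glob  # import location assumed from the test module's path

    assert path_to_glob("{year}/{month}/{day}.csv") == "*/*/*.csv"
    assert path_to_glob("data/{year:4}{month:02}{day:02}.csv") == "data/*.csv"
    assert path_to_glob("data/**/*.csv") == "data/**/*.csv"   # plain globs pass through unchanged
    assert path_to_glob("{lone_param}") == "*"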
[ 97%] 166s intake/tests/test_top_level.py::test_old_usage PASSED [ 98%] 166s intake/tests/test_top_level.py::test_no_imports PASSED [ 98%] 166s intake/tests/test_top_level.py::test_nested_catalog_access PASSED [ 98%] 166s intake/tests/test_utils.py::test_windows_file_path PASSED [ 98%] 166s intake/tests/test_utils.py::test_make_path_posix_removes_double_sep PASSED [ 99%] 166s intake/tests/test_utils.py::test_noops[~/fake.file] PASSED [ 99%] 166s intake/tests/test_utils.py::test_noops[https://example.com] PASSED [ 99%] 166s intake/tests/test_utils.py::test_roundtrip_file_path PASSED [ 99%] 166s intake/tests/test_utils.py::test_yaml_tuples PASSED [100%] 166s 166s ==================================== ERRORS ==================================== 166s ____________ ERROR at setup of test_no_panel_does_not_raise_errors _____________ 166s 166s attr = 'pytest_plugins' 166s 166s def __getattr__(attr): 166s if attr == 'instance': 166s do_import() 166s > return gl['instance'] 166s E KeyError: 'instance' 166s 166s intake/interface/__init__.py:39: KeyError 166s _______________ ERROR at setup of test_no_panel_display_init_gui _______________ 166s 166s attr = 'pytest_plugins' 166s 166s def __getattr__(attr): 166s if attr == 'instance': 166s do_import() 166s > return gl['instance'] 166s E KeyError: 'instance' 166s 166s intake/interface/__init__.py:39: KeyError 166s ___________________ ERROR at setup of test_display_init_gui ____________________ 166s 166s attr = 'pytest_plugins' 166s 166s def __getattr__(attr): 166s if attr == 'instance': 166s do_import() 166s > return gl['instance'] 166s E KeyError: 'instance' 166s 166s intake/interface/__init__.py:39: KeyError 166s =================================== FAILURES =================================== 166s ______________________________ test_load_textfile ______________________________ 166s 166s catalog_cache = 166s 166s def test_load_textfile(catalog_cache): 166s cat = catalog_cache['text_cache'] 166s cache = cat.cache[0] 166s 166s cache_paths = cache.load(cat._urlpath, output=False) 166s > cache_path = cache_paths[-1] 166s E TypeError: 'NoneType' object is not subscriptable 166s 166s intake/catalog/tests/test_caching_integration.py:53: TypeError 166s _________________________________ test_nested __________________________________ 166s 166s args = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv',) 166s kwargs = {'storage_options': None} 166s func = .read at 0x624706178fe0> 166s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files') 166s 166s @wraps(fn) 166s def wrapper(*args, **kwargs): 166s func = getattr(self, dispatch_name) 166s try: 166s > return func(*args, **kwargs) 166s 166s /usr/lib/python3/dist-packages/dask/backends.py:140: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 166s return read_pandas( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s reader = 166s urlpath = '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv' 166s blocksize = 'default', lineterminator = '\n', compression = 'infer' 166s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 166s storage_options = None, include_path_column = False, kwargs = {} 166s reader_name = 'read_csv', b_lineterminator = 
b'\n', kw = 'chunksize' 166s lastskiprow = 0, firstrow = 0 166s 166s def read_pandas( 166s reader, 166s urlpath, 166s blocksize="default", 166s lineterminator=None, 166s compression="infer", 166s sample=256000, 166s sample_rows=10, 166s enforce=False, 166s assume_missing=False, 166s storage_options=None, 166s include_path_column=False, 166s **kwargs, 166s ): 166s reader_name = reader.__name__ 166s if lineterminator is not None and len(lineterminator) == 1: 166s kwargs["lineterminator"] = lineterminator 166s else: 166s lineterminator = "\n" 166s if "encoding" in kwargs: 166s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 166s empty_blob = "".encode(kwargs["encoding"]) 166s if empty_blob: 166s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 166s # start of the line terminator, since this value is not a full file. 166s b_lineterminator = b_lineterminator[len(empty_blob) :] 166s else: 166s b_lineterminator = lineterminator.encode() 166s if include_path_column and isinstance(include_path_column, bool): 166s include_path_column = "path" 166s if "index" in kwargs or ( 166s "index_col" in kwargs and kwargs.get("index_col") is not False 166s ): 166s raise ValueError( 166s "Keywords 'index' and 'index_col' not supported, except for " 166s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 166s ) 166s for kw in ["iterator", "chunksize"]: 166s if kw in kwargs: 166s raise ValueError(f"{kw} not supported for dd.{reader_name}") 166s if kwargs.get("nrows", None): 166s raise ValueError( 166s "The 'nrows' keyword is not supported by " 166s "`dd.{0}`. To achieve the same behavior, it's " 166s "recommended to use `dd.{0}(...)." 166s "head(n=nrows)`".format(reader_name) 166s ) 166s if isinstance(kwargs.get("skiprows"), int): 166s lastskiprow = firstrow = kwargs.get("skiprows") 166s elif kwargs.get("skiprows") is None: 166s lastskiprow = firstrow = 0 166s else: 166s # When skiprows is a list, we expect more than max(skiprows) to 166s # be included in the sample. This means that [0,2] will work well, 166s # but [0, 440] might not work. 166s skiprows = set(kwargs.get("skiprows")) 166s lastskiprow = max(skiprows) 166s # find the firstrow that is not skipped, for use as header 166s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 166s if isinstance(kwargs.get("header"), list): 166s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 166s if isinstance(kwargs.get("converters"), dict) and include_path_column: 166s path_converter = kwargs.get("converters").get(include_path_column, None) 166s else: 166s path_converter = None 166s 166s # If compression is "infer", inspect the (first) path suffix and 166s # set the proper compression option if the suffix is recognized. 
166s if compression == "infer": 166s # Translate the input urlpath to a simple path list 166s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 166s 2 166s ] 166s 166s # Check for at least one valid path 166s if len(paths) == 0: 166s > raise OSError(f"{urlpath} resolved to no files") 166s E OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 166s 166s The above exception was the direct cause of the following exception: 166s 166s catalog1 = 166s 166s def test_nested(catalog1): 166s assert 'nested' in catalog1 166s assert 'entry1' in catalog1.nested.nested() 166s > assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read()) 166s 166s intake/catalog/tests/test_local.py:86: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/source/csv.py:129: in read 166s self._get_schema() 166s intake/source/csv.py:115: in _get_schema 166s self._open_dataset(urlpath) 166s intake/source/csv.py:94: in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s args = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv',) 166s kwargs = {'storage_options': None} 166s func = .read at 0x624706178fe0> 166s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files') 166s 166s @wraps(fn) 166s def wrapper(*args, **kwargs): 166s func = getattr(self, dispatch_name) 166s try: 166s return func(*args, **kwargs) 166s except Exception as e: 166s try: 166s exc = type(e)( 166s f"An error occurred while calling the {funcname(func)} " 166s f"method registered to the {self.backend} backend.\n" 166s f"Original Message: {e}" 166s ) 166s except TypeError: 166s raise e 166s else: 166s > raise exc from e 166s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
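Editor's note: the read_pandas listing above shows why these tests fail: dask resolves the CSV urlpath to a list of matching files before reading, and raises OSError when the glob matches nothing. A minimal reproduction sketch outside the test suite, using a made-up path that matches no files (the wrapper in dask/backends.py re-raises the same OSError with the "pandas backend" prefix seen in this log):

    import dask.dataframe as dd

    try:
        # Hypothetical glob for illustration; any urlpath that matches zero files takes this code path.
        dd.read_csv("/no/such/dir/entry1_*.csv", storage_options=None)
    except OSError as err:
        print(err)  # message ends with "... resolved to no files", possibly wrapped by the backend dispatcher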
166s E Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 166s ______________________________ test_info_describe ______________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_info_describe(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1', 166s 'entry1_part', 'remote_env', 166s 'local_env', 'text', 'arr', 'datetime']) 166s 166s > info = catalog['entry1'].describe() 166s 166s intake/catalog/tests/test_remote_integration.py:29: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
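Editor's note: the client half of the server protocol shown in the open_remote listing above boils down to a msgpack-encoded POST to the server's v1/source endpoint; the 400 responses in these remote-integration tests are that call failing. A condensed sketch reusing the names from the listing (the URL and entry name are this run's fixture values, and the packb/unpackb options intake passes are omitted):

    import msgpack
    import requests
    from requests.compat import urljoin

    url = "http://localhost:7483/"            # fixture value from this test run
    payload = dict(action="open", name="entry1", parameters={}, available_plugins=[])
    req = requests.post(urljoin(url, "v1/source"), data=msgpack.packb(payload))
    if req.ok:
        response = msgpack.unpackb(req.content)   # 'plugin'/'args' for direct access, otherwise proxied
    else:
        # This is the branch the failing tests hit: the server answered 400 "Discover failed".
        raise Exception("Server error: %d, %s" % (req.status_code, req.reason))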
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ---------------------------- Captured stderr setup ----------------------------- 166s 2025-11-17 19:03:09,484 - intake - INFO - __main__.py:main:L53 - Creating catalog from: 166s 2025-11-17 19:03:09,484 - intake - INFO - __main__.py:main:L55 - - /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog1.yml 166s 2025-11-17 19:03:09,796 - intake - INFO - __main__.py:main:L62 - catalog_args: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog1.yml 166s 2025-11-17 19:03:09,796 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483 166s ----------------------------- Captured stderr call ----------------------------- 166s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 166s Dask dataframe query planning is disabled because dask-expr is not installed. 166s 166s You can install it with `pip install dask[dataframe]` or `conda install dask`. 166s This will raise in a future version. 166s 166s warnings.warn(msg, FutureWarning) 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 193.69ms 166s ______________________________ test_remote_direct ______________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_remote_direct(intake_server): 166s from intake.container.dataframe import RemoteDataFrame 166s catalog = open_catalog(intake_server) 166s > s0 = catalog.entry1() 166s 166s intake/catalog/tests/test_remote_integration.py:74: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.41ms 166s _______________________ test_remote_datasource_interface _______________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_remote_datasource_interface(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog['entry1'] 166s 166s intake/catalog/tests/test_remote_integration.py:101: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.35ms 166s __________________________________ test_read ___________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_read(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog['entry1'] 166s 166s intake/catalog/tests/test_remote_integration.py:116: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s --------------------------- Captured stderr teardown --------------------------- 166s 400 POST /v1/source (::1) 6.61ms 166s _______________________________ test_read_chunks _______________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_read_chunks(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog.entry1 166s 166s intake/catalog/tests/test_remote_integration.py:170: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.31ms 166s _____________________________ test_read_partition ______________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_read_partition(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog.entry1 166s 166s intake/catalog/tests/test_remote_integration.py:186: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.29ms 166s __________________________________ test_close __________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_close(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog.entry1 166s 166s intake/catalog/tests/test_remote_integration.py:201: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.29ms 166s __________________________________ test_with ___________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_with(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > with catalog.entry1 as f: 166s 166s intake/catalog/tests/test_remote_integration.py:208: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.87ms 166s _________________________________ test_pickle __________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_pickle(intake_server): 166s catalog = open_catalog(intake_server) 166s 166s > d = catalog.entry1 166s 166s intake/catalog/tests/test_remote_integration.py:215: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.34ms 166s _________________________________ test_to_dask _________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_to_dask(intake_server): 166s catalog = open_catalog(intake_server) 166s > d = catalog.entry1 166s 166s intake/catalog/tests/test_remote_integration.py:231: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/catalog/base.py:391: in __getattr__ 166s return self[item] # triggers reload_on_change 166s intake/catalog/base.py:436: in __getitem__ 166s s = self._get_entry(key) 166s intake/catalog/utils.py:45: in wrapper 166s return f(self, *args, **kwargs) 166s intake/catalog/base.py:323: in _get_entry 166s return entry() 166s intake/catalog/entry.py:77: in __call__ 166s s = self.get(**kwargs) 166s intake/catalog/remote.py:459: in get 166s return open_remote( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 166s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 166s page_size = None, persist_mode = 'default' 166s auth = , getenv = True 166s getshell = True 166s 166s def open_remote(url, entry, container, user_parameters, description, http_args, 166s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 166s """Create either local direct data source or remote streamed source""" 166s from intake.container import container_map 166s import msgpack 166s import requests 166s from requests.compat import urljoin 166s 166s if url.startswith('intake://'): 166s url = url[len('intake://'):] 166s payload = dict(action='open', 166s name=entry, 166s parameters=user_parameters, 166s available_plugins=list(plugin_registry)) 166s req = requests.post(urljoin(url, 'v1/source'), 166s data=msgpack.packb(payload, **pack_kwargs), 166s **http_args) 166s if req.ok: 166s response = msgpack.unpackb(req.content, **unpack_kwargs) 166s 166s if 'plugin' in response: 166s pl = response['plugin'] 166s pl = [pl] if isinstance(pl, str) else pl 166s # Direct access 166s for p in pl: 166s if p in plugin_registry: 166s source = plugin_registry[p](**response['args']) 166s proxy = False 166s break 166s else: 166s proxy = True 166s else: 166s proxy = True 166s if proxy: 166s response.pop('container') 166s response.update({'name': entry, 'parameters': user_parameters}) 166s if container == 'catalog': 166s response.update({'auth': auth, 166s 'getenv': getenv, 166s 'getshell': getshell, 166s 'page_size': page_size, 166s 'persist_mode': persist_mode 166s # TODO ttl? 166s # TODO storage_options? 166s }) 166s source = container_map[container](url, http_args, **response) 166s source.description = description 166s return source 166s else: 166s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 166s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s intake/catalog/remote.py:519: Exception 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests//entry1_*.csv resolved to no files 166s 400 POST /v1/source (::1): Discover failed 166s 400 POST /v1/source (::1) 3.31ms 166s _____________________________ test_remote_sequence _____________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_remote_sequence(intake_server): 166s import glob 166s d = os.path.dirname(TEST_CATALOG_PATH) 166s catalog = open_catalog(intake_server) 166s assert 'text' in catalog 166s s = catalog.text() 166s s.discover() 166s > assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml'))) 166s E AssertionError: assert 0 == 29 166s E + where 0 = sources:\n text:\n args:\n dtype: null\n extra_metadata:\n catalog_dir: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/\n headers:\n headers: {}\n name: text\n npartitions: 0\n parameters: {}\n shape:\n - null\n source_id: e53980c6-70d1-4e0f-9e2b-e8189d61e706\n url: http://localhost:7483/\n description: textfiles in this dir\n driver: intake.container.semistructured.RemoteSequenceSource\n metadata:\n catalog_dir: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/\n.npartitions 166s E + and 29 = len(['/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/data_source_name_non_string.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/plugins_non_dict.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog_dup_parameters.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/params_value_non_dict.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog1.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog_dup_sources.yml', ...]) 166s E + where ['/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/data_source_name_non_string.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/plugins_non_dict.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog_dup_parameters.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/params_value_non_dict.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog1.yml', '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/catalog_dup_sources.yml', ...] = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/*.yml') 166s E + where = .glob 166s E + and '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests/*.yml' = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/catalog/tests', '*.yml') 166s E + where = .join 166s E + where = os.path 166s 166s intake/catalog/tests/test_remote_integration.py:263: AssertionError 166s ___________________________________ test_dir ___________________________________ 166s 166s intake_server = 'intake://localhost:7483' 166s 166s def test_dir(intake_server): 166s PAGE_SIZE = 2 166s catalog = open_catalog(intake_server, page_size=PAGE_SIZE) 166s assert len(catalog._entries._page_cache) == 0 166s assert len(catalog._entries._direct_lookup_cache) == 0 166s assert not catalog._entries.complete 166s 166s with pytest.warns(UserWarning, match="Tab-complete"): 166s key_completions = catalog._ipython_key_completions_() 166s with pytest.warns(UserWarning, match="Tab-complete"): 166s dir_ = dir(catalog) 166s # __dir__ triggers loading the first page. 
166s assert len(catalog._entries._page_cache) == 2 166s assert len(catalog._entries._direct_lookup_cache) == 0 166s assert not catalog._entries.complete 166s assert set(key_completions) == set(['use_example1', 'nested']) 166s assert 'metadata' in dir_ # a normal attribute 166s assert 'use_example1' in dir_ # an entry from the first page 166s assert 'arr' not in dir_ # an entry we haven't cached yet 166s 166s # Trigger fetching one specific name. 166s catalog['arr'] 166s with pytest.warns(UserWarning, match="Tab-complete"): 166s dir_ = dir(catalog) 166s with pytest.warns(UserWarning, match="Tab-complete"): 166s key_completions = catalog._ipython_key_completions_() 166s assert 'metadata' in dir_ 166s assert 'arr' in dir_ # an entry cached via direct access 166s assert 'arr' in key_completions 166s 166s # Load everything. 166s list(catalog) 166s assert catalog._entries.complete 166s > with pytest.warns(None) as record: 166s 166s intake/catalog/tests/test_remote_integration.py:338: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s self = WarningsChecker(record=True), expected_warning = None, match_expr = None 166s 166s def __init__( 166s self, 166s expected_warning: type[Warning] | tuple[type[Warning], ...] = Warning, 166s match_expr: str | Pattern[str] | None = None, 166s *, 166s _ispytest: bool = False, 166s ) -> None: 166s check_ispytest(_ispytest) 166s super().__init__(_ispytest=True) 166s 166s msg = "exceptions must be derived from Warning, not %s" 166s if isinstance(expected_warning, tuple): 166s for exc in expected_warning: 166s if not issubclass(exc, Warning): 166s raise TypeError(msg % type(exc)) 166s expected_warning_tup = expected_warning 166s elif isinstance(expected_warning, type) and issubclass( 166s expected_warning, Warning 166s ): 166s expected_warning_tup = (expected_warning,) 166s else: 166s > raise TypeError(msg % type(expected_warning)) 166s E TypeError: exceptions must be derived from Warning, not 166s 166s /usr/lib/python3/dist-packages/_pytest/recwarn.py:279: TypeError 166s ________________________________ test_discover _________________________________ 166s 166s def test_discover(): 166s cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML, 166s 'entry1'] 166s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 166s universal_newlines=True) 166s out, _ = process.communicate() 166s 166s > assert "'dtype':" in out 166s E assert "'dtype':" in '' 166s 166s intake/cli/client/tests/test_local_integration.py:89: AssertionError 166s ----------------------------- Captured stderr call ----------------------------- 166s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 166s Dask dataframe query planning is disabled because dask-expr is not installed. 166s 166s You can install it with `pip install dask[dataframe]` or `conda install dask`. 166s This will raise in a future version. 
166s 166s warnings.warn(msg, FutureWarning) 166s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 166s ________________________________ test_get_pass _________________________________ 166s 166s def test_get_pass(): 166s cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1'] 166s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 166s universal_newlines=True) 166s out, _ = process.communicate() 166s 166s > assert 'Charlie1 25.0 3' in out 166s E AssertionError: assert 'Charlie1 25.0 3' in '' 166s 166s intake/cli/client/tests/test_local_integration.py:101: AssertionError 166s ----------------------------- Captured stderr call ----------------------------- 166s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 166s Dask dataframe query planning is disabled because dask-expr is not installed. 166s 166s You can install it with `pip install dask[dataframe]` or `conda install dask`. 166s This will raise in a future version. 166s 166s warnings.warn(msg, FutureWarning) 166s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 166s ______________________ TestServerV1Source.test_idle_timer ______________________ 166s 166s self = 166s 166s def test_idle_timer(self): 166s self.server.start_periodic_functions(close_idle_after=0.1, 166s remove_idle_after=0.2) 166s 166s msg = dict(action='open', name='entry1', parameters={}) 166s > resp_msg, = self.make_post_request(msg) 166s 166s intake/cli/server/tests/test_server.py:208: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/cli/server/tests/test_server.py:96: in make_post_request 166s self.assertEqual(response.code, expected_status) 166s E AssertionError: 400 != 200 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 
166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s ------------------------------ Captured log call ------------------------------- 166s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 166s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 5.75ms 166s ______________________ TestServerV1Source.test_no_format _______________________ 166s 166s self = 166s 166s def test_no_format(self): 166s msg = dict(action='open', name='entry1', parameters={}) 166s > resp_msg, = self.make_post_request(msg) 166s 166s intake/cli/server/tests/test_server.py:195: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/cli/server/tests/test_server.py:96: in make_post_request 166s self.assertEqual(response.code, expected_status) 166s E AssertionError: 400 != 200 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 
166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s ------------------------------ Captured log call ------------------------------- 166s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 166s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 5.02ms 166s _________________________ TestServerV1Source.test_open _________________________ 166s 166s self = 166s 166s def test_open(self): 166s msg = dict(action='open', name='entry1', parameters={}) 166s > resp_msg, = self.make_post_request(msg) 166s 166s intake/cli/server/tests/test_server.py:112: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/cli/server/tests/test_server.py:96: in make_post_request 166s self.assertEqual(response.code, expected_status) 166s E AssertionError: 400 != 200 166s ----------------------------- Captured stderr call ----------------------------- 166s Traceback (most recent call last): 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 166s return func(*args, **kwargs) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 166s return read_pandas( 166s reader, 166s ...<10 lines>... 
166s **kwargs, 166s ) 166s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 166s raise OSError(f"{urlpath} resolved to no files") 166s OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s 166s The above exception was the direct cause of the following exception: 166s 166s Traceback (most recent call last): 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/server.py", line 306, in post 166s source.discover() 166s ~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 347, in discover 166s self._load_metadata() 166s ~~~~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/base.py", line 285, in _load_metadata 166s self._schema = self._get_schema() 166s ~~~~~~~~~~~~~~~~^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 115, in _get_schema 166s self._open_dataset(urlpath) 166s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 166s File "/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/csv.py", line 94, in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s ~~~~~~~~~~~~~~~~~~~~~~~^ 166s urlpath, storage_options=self._storage_options, 166s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 166s **self._csv_kwargs) 166s ^^^^^^^^^^^^^^^^^^^ 166s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 166s raise exc from e 166s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 166s Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/cli/server/tests//entry1_*.csv resolved to no files 166s ------------------------------ Captured log call ------------------------------- 166s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 166s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 5.03ms 166s ________________________________ test_other_cat ________________________________ 166s 166s args = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 166s kwargs = {'storage_options': None} 166s func = .read at 0x624706178fe0> 166s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 166s 166s @wraps(fn) 166s def wrapper(*args, **kwargs): 166s func = getattr(self, dispatch_name) 166s try: 166s > return func(*args, **kwargs) 166s 166s /usr/lib/python3/dist-packages/dask/backends.py:140: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 166s return read_pandas( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s reader = 166s urlpath = '/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv' 166s blocksize = 'default', lineterminator = '\n', compression = 'infer' 166s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 166s storage_options = None, include_path_column = False, kwargs = {} 166s reader_name = 'read_csv', b_lineterminator = b'\n', kw = 'chunksize' 166s lastskiprow = 0, firstrow = 0 166s 166s def read_pandas( 166s reader, 166s urlpath, 166s blocksize="default", 166s lineterminator=None, 166s compression="infer", 166s sample=256000, 166s 
sample_rows=10, 166s enforce=False, 166s assume_missing=False, 166s storage_options=None, 166s include_path_column=False, 166s **kwargs, 166s ): 166s reader_name = reader.__name__ 166s if lineterminator is not None and len(lineterminator) == 1: 166s kwargs["lineterminator"] = lineterminator 166s else: 166s lineterminator = "\n" 166s if "encoding" in kwargs: 166s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 166s empty_blob = "".encode(kwargs["encoding"]) 166s if empty_blob: 166s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 166s # start of the line terminator, since this value is not a full file. 166s b_lineterminator = b_lineterminator[len(empty_blob) :] 166s else: 166s b_lineterminator = lineterminator.encode() 166s if include_path_column and isinstance(include_path_column, bool): 166s include_path_column = "path" 166s if "index" in kwargs or ( 166s "index_col" in kwargs and kwargs.get("index_col") is not False 166s ): 166s raise ValueError( 166s "Keywords 'index' and 'index_col' not supported, except for " 166s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 166s ) 166s for kw in ["iterator", "chunksize"]: 166s if kw in kwargs: 166s raise ValueError(f"{kw} not supported for dd.{reader_name}") 166s if kwargs.get("nrows", None): 166s raise ValueError( 166s "The 'nrows' keyword is not supported by " 166s "`dd.{0}`. To achieve the same behavior, it's " 166s "recommended to use `dd.{0}(...)." 166s "head(n=nrows)`".format(reader_name) 166s ) 166s if isinstance(kwargs.get("skiprows"), int): 166s lastskiprow = firstrow = kwargs.get("skiprows") 166s elif kwargs.get("skiprows") is None: 166s lastskiprow = firstrow = 0 166s else: 166s # When skiprows is a list, we expect more than max(skiprows) to 166s # be included in the sample. This means that [0,2] will work well, 166s # but [0, 440] might not work. 166s skiprows = set(kwargs.get("skiprows")) 166s lastskiprow = max(skiprows) 166s # find the firstrow that is not skipped, for use as header 166s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 166s if isinstance(kwargs.get("header"), list): 166s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 166s if isinstance(kwargs.get("converters"), dict) and include_path_column: 166s path_converter = kwargs.get("converters").get(include_path_column, None) 166s else: 166s path_converter = None 166s 166s # If compression is "infer", inspect the (first) path suffix and 166s # set the proper compression option if the suffix is recognized. 
166s if compression == "infer": 166s # Translate the input urlpath to a simple path list 166s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 166s 2 166s ] 166s 166s # Check for at least one valid path 166s if len(paths) == 0: 166s > raise OSError(f"{urlpath} resolved to no files") 166s E OSError: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 166s 166s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 166s 166s The above exception was the direct cause of the following exception: 166s 166s def test_other_cat(): 166s cat = intake.open_catalog(catfile) 166s > df1 = cat.other_cat.read() 166s 166s intake/source/tests/test_derived.py:35: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/source/derived.py:252: in read 166s return self.to_dask().compute() 166s intake/source/derived.py:239: in to_dask 166s self._df = self._transform(self._source.to_dask(), 166s intake/source/csv.py:133: in to_dask 166s self._get_schema() 166s intake/source/csv.py:115: in _get_schema 166s self._open_dataset(urlpath) 166s intake/source/csv.py:94: in _open_dataset 166s self._dataframe = dask.dataframe.read_csv( 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s args = ('/tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 166s kwargs = {'storage_options': None} 166s func = .read at 0x624706178fe0> 166s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 166s 166s @wraps(fn) 166s def wrapper(*args, **kwargs): 166s func = getattr(self, dispatch_name) 166s try: 166s return func(*args, **kwargs) 166s except Exception as e: 166s try: 166s exc = type(e)( 166s f"An error occurred while calling the {funcname(func)} " 166s f"method registered to the {self.backend} backend.\n" 166s f"Original Message: {e}" 166s ) 166s except TypeError: 166s raise e 166s else: 166s > raise exc from e 166s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
166s E Original Message: /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 166s 166s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 166s ______________________________ test_text_persist _______________________________ 166s 166s temp_cache = None 166s 166s def test_text_persist(temp_cache): 166s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 166s s = cat.sometext() 166s > s2 = s.persist() 166s 166s intake/source/tests/test_text.py:88: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/source/base.py:226: in persist 166s out = self._export(store.getdir(self), **kwargs) 166s intake/source/base.py:460: in _export 166s out = method(self, path=path, **kwargs) 166s intake/container/semistructured.py:70: in _persist 166s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 166s intake/container/semistructured.py:90: in _data_to_source 166s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 166s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 166s fs, fs_token, paths = get_fs_token_paths( 166s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 166s paths = _expand_paths(paths, name_function, num) 166s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 166s name_function = build_name_function(num - 1) 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s max_int = -0.99999999 166s 166s def build_name_function(max_int: float) -> Callable[[int], str]: 166s """Returns a function that receives a single integer 166s and returns it as a string padded by enough zero characters 166s to align with maximum possible integer 166s 166s >>> name_f = build_name_function(57) 166s 166s >>> name_f(7) 166s '07' 166s >>> name_f(31) 166s '31' 166s >>> build_name_function(1000)(42) 166s '0042' 166s >>> build_name_function(999)(42) 166s '042' 166s >>> build_name_function(0)(0) 166s '0' 166s """ 166s # handle corner cases max_int is 0 or exact power of 10 166s max_int += 1e-8 166s 166s > pad_length = int(math.ceil(math.log10(max_int))) 166s E ValueError: math domain error 166s 166s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError 166s _______________________________ test_text_export _______________________________ 166s 166s temp_cache = None 166s 166s def test_text_export(temp_cache): 166s import tempfile 166s outdir = tempfile.mkdtemp() 166s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 166s s = cat.sometext() 166s > out = s.export(outdir) 166s 166s intake/source/tests/test_text.py:97: 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s intake/source/base.py:452: in export 166s return self._export(path, **kwargs) 166s intake/source/base.py:460: in _export 166s out = method(self, path=path, **kwargs) 166s intake/container/semistructured.py:70: in _persist 166s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 166s intake/container/semistructured.py:90: in _data_to_source 166s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 166s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 166s fs, fs_token, paths = get_fs_token_paths( 166s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 166s paths = _expand_paths(paths, name_function, num) 166s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 166s name_function = 
build_name_function(num - 1) 166s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 166s 166s max_int = -0.99999999 166s 166s def build_name_function(max_int: float) -> Callable[[int], str]: 166s """Returns a function that receives a single integer 166s and returns it as a string padded by enough zero characters 166s to align with maximum possible integer 166s 166s >>> name_f = build_name_function(57) 166s 166s >>> name_f(7) 166s '07' 166s >>> name_f(31) 166s '31' 166s >>> build_name_function(1000)(42) 166s '0042' 166s >>> build_name_function(999)(42) 166s '042' 166s >>> build_name_function(0)(0) 166s '0' 166s """ 166s # handle corner cases max_int is 0 or exact power of 10 166s max_int += 1e-8 166s 166s > pad_length = int(math.ceil(math.log10(max_int))) 166s E ValueError: math domain error 166s 166s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError 166s =============================== warnings summary =============================== 166s intake/catalog/tests/test_alias.py::test_simple 166s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 166s Dask dataframe query planning is disabled because dask-expr is not installed. 166s 166s You can install it with `pip install dask[dataframe]` or `conda install dask`. 166s This will raise in a future version. 166s 166s warnings.warn(msg, FutureWarning) 166s 166s intake/source/tests/test_cache.py::test_filtered_compressed_cache 166s intake/source/tests/test_cache.py::test_compressions[tgz] 166s intake/source/tests/test_cache.py::test_compressions[tgz] 166s /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/decompress.py:27: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 166s tar.extractall(outpath) 166s 166s intake/source/tests/test_cache.py::test_compressions[tbz] 166s intake/source/tests/test_cache.py::test_compressions[tbz] 166s /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/decompress.py:37: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 166s tar.extractall(outpath) 166s 166s intake/source/tests/test_cache.py::test_compressions[tar] 166s intake/source/tests/test_cache.py::test_compressions[tar] 166s /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/decompress.py:47: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior. 166s tar.extractall(outpath) 166s 166s intake/source/tests/test_discovery.py::test_package_scan 166s intake/source/tests/test_discovery.py::test_package_scan 166s intake/source/tests/test_discovery.py::test_enable_and_disable 166s intake/source/tests/test_discovery.py::test_discover_collision 166s /tmp/autopkgtest.sDpKou/build.wv0/src/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed 166s warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning) 166s 166s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html 166s =========================== short test summary info ============================ 166s FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile 166s FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc... 
166s FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface 166s FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition 166s FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except... 166s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence 166s FAILED intake/catalog/tests/test_remote_integration.py::test_dir - TypeError:... 166s FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass... 166s FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass... 166s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer 166s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format 166s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open 166s FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro... 166s FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math... 166s FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ... 166s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors 166s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui 166s ERROR intake/interface/tests/test_init_gui.py::test_display_init_gui - KeyErr... 166s ====== 22 failed, 379 passed, 31 skipped, 12 warnings, 3 errors in 42.67s ====== 166s autopkgtest [19:03:32]: test run-unit-test: -----------------------] 167s run-unit-test FAIL non-zero exit status 1 167s autopkgtest [19:03:33]: test run-unit-test: - - - - - - - - - - results - - - - - - - - - - 167s autopkgtest [19:03:33]: @@@@@@@@@@@@@@@@@@@@ summary 167s run-unit-test FAIL non-zero exit status 1
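Note on the read_csv failures above: every "resolved to no files" error comes from dask.dataframe.read_csv being handed a glob (entry1_*.csv under the test directories) that matches nothing, which suggests the expected CSV fixtures were not present in the unpacked source tree. A minimal sketch of the same failure mode, using a hypothetical non-matching glob rather than the paths from this log:

    import dask.dataframe as dd

    try:
        # Any glob that matches no files raises the same OSError seen above;
        # dask's backend dispatch re-raises it with the "Original Message" suffix.
        dd.read_csv("/tmp/no-such-dir/entry1_*.csv")
    except OSError as exc:
        print(exc)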
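Note on the test_dir TypeError: current pytest rejects pytest.warns(None) because the expected warning must derive from Warning, so the WarningsChecker constructor fails before the test body runs. A hedged sketch of one common replacement for asserting that no warnings are emitted, with a stand-in function instead of the catalog calls used in the test:

    import warnings

    def quiet_operation():
        # hypothetical stand-in for the code under test
        return list(range(3))

    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        quiet_operation()
    assert len(record) == 0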
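Note on the test_text_persist and test_text_export ValueError: fsspec's build_name_function is called with num - 1, where num is the number of "part.*" files to expand; with zero partitions that is -1, and log10 of a negative number is a math domain error. A sketch of just that arithmetic (the call chain itself is already shown in the traceback above):

    import math

    num = 0                      # no partitions produced by the text source
    max_int = (num - 1) + 1e-8   # -0.99999999, matching the traceback
    try:
        int(math.ceil(math.log10(max_int)))
    except ValueError as exc:
        print(exc)               # math domain error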
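Note on the tarfile DeprecationWarning in the warnings summary: recent Python releases warn when TarFile.extractall() is called without an extraction filter, ahead of the default changing in 3.14. A hedged sketch of the explicit form; extract_archive and its arguments are placeholders, not intake's decompress helpers:

    import tarfile

    def extract_archive(archive_path, outpath):
        with tarfile.open(archive_path) as tar:
            # An explicit filter silences the warning and opts in to the
            # behaviour that becomes the default in Python 3.14.
            tar.extractall(outpath, filter="data")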