0s autopkgtest [16:41:40]: starting date and time: 2025-11-17 16:41:40+0000
0s autopkgtest [16:41:40]: git checkout: 4b346b80 nova: make wait_reboot return success even when a no-op
0s autopkgtest [16:41:40]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.gkr6eysl/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:intake --apt-upgrade intake --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=intake/0.6.6-4 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-s390x --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos03-s390x-14.secgroup --name adt-resolute-s390x-intake-20251117-164140-juju-7f2275-prod-proposed-migration-environment-2-65d936dc-de7c-4bf3-a6fe-07f1ea663cbf --image adt/ubuntu-resolute-s390x-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration-s390x -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
4s Creating nova instance adt-resolute-s390x-intake-20251117-164140-juju-7f2275-prod-proposed-migration-environment-2-65d936dc-de7c-4bf3-a6fe-07f1ea663cbf from image adt/ubuntu-resolute-s390x-server-20251117.img (UUID a3a3e3b9-e6ba-478c-a5e9-fce6f0982a95)...
51s autopkgtest [16:42:31]: testbed dpkg architecture: s390x
51s autopkgtest [16:42:31]: testbed apt version: 3.1.11
51s autopkgtest [16:42:31]: @@@@@@@@@@@@@@@@@@@@ test bed setup
52s autopkgtest [16:42:32]: testbed release detected to be: None
52s autopkgtest [16:42:32]: updating testbed package index (apt update)
53s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease [87.8 kB]
53s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
53s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
53s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
53s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/main Sources [73.2 kB]
53s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/universe Sources [779 kB]
54s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/restricted Sources [9852 B]
54s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse Sources [22.9 kB]
54s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x Packages [134 kB]
54s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/restricted s390x Packages [940 B]
54s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/universe s390x Packages [488 kB]
54s Get:12 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse s390x Packages [10.6 kB]
54s Fetched 1606 kB in 1s (1380 kB/s)
54s Reading package lists...
55s Hit:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease
55s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
55s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
55s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
56s Reading package lists...
56s Reading package lists...
56s Building dependency tree...
56s Reading state information...
56s Calculating upgrade...
56s The following packages will be upgraded:
56s   apt libapt-pkg7.0 libcrypt-dev libcrypt1 usbutils
56s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
56s Need to get 2926 kB of archives.
56s After this operation, 50.2 kB of additional disk space will be used.
56s Get:1 http://ftpmaster.internal/ubuntu resolute/main s390x libcrypt-dev s390x 1:4.5.1-1 [127 kB]
57s Get:2 http://ftpmaster.internal/ubuntu resolute/main s390x libcrypt1 s390x 1:4.5.1-1 [96.1 kB]
57s Get:3 http://ftpmaster.internal/ubuntu resolute/main s390x libapt-pkg7.0 s390x 3.1.12 [1150 kB]
57s Get:4 http://ftpmaster.internal/ubuntu resolute/main s390x apt s390x 3.1.12 [1468 kB]
57s Get:5 http://ftpmaster.internal/ubuntu resolute/main s390x usbutils s390x 1:019-1 [85.6 kB]
57s dpkg-preconfigure: unable to re-open stdin: No such file or directory
58s Fetched 2926 kB in 1s (3055 kB/s)
58s (Reading database ... 61309 files and directories currently installed.)
58s Preparing to unpack .../libcrypt-dev_1%3a4.5.1-1_s390x.deb ...
58s Unpacking libcrypt-dev:s390x (1:4.5.1-1) over (1:4.4.38-1build1) ...
58s Preparing to unpack .../libcrypt1_1%3a4.5.1-1_s390x.deb ...
58s Unpacking libcrypt1:s390x (1:4.5.1-1) over (1:4.4.38-1build1) ...
58s Setting up libcrypt1:s390x (1:4.5.1-1) ...
58s (Reading database ... 61309 files and directories currently installed.)
58s Preparing to unpack .../libapt-pkg7.0_3.1.12_s390x.deb ...
58s Unpacking libapt-pkg7.0:s390x (3.1.12) over (3.1.11) ...
58s Preparing to unpack .../archives/apt_3.1.12_s390x.deb ...
58s Unpacking apt (3.1.12) over (3.1.11) ...
58s Preparing to unpack .../usbutils_1%3a019-1_s390x.deb ...
58s Unpacking usbutils (1:019-1) over (1:018-2) ...
58s Setting up usbutils (1:019-1) ...
58s Setting up libcrypt-dev:s390x (1:4.5.1-1) ...
58s Setting up libapt-pkg7.0:s390x (3.1.12) ...
58s Setting up apt (3.1.12) ...
59s Processing triggers for man-db (2.13.1-1) ...
60s Processing triggers for libc-bin (2.42-2ubuntu2) ...
60s autopkgtest [16:42:40]: upgrading testbed (apt dist-upgrade and autopurge)
61s Reading package lists...
61s Building dependency tree...
61s Reading state information...
61s Calculating upgrade...
61s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
62s Reading package lists...
62s Building dependency tree...
62s Reading state information...
62s Solving dependencies...
62s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
62s autopkgtest [16:42:42]: rebooting testbed after setup commands that affected boot
78s autopkgtest [16:42:58]: testbed running kernel: Linux 6.17.0-5-generic #5-Ubuntu SMP Mon Sep 22 08:56:47 UTC 2025
80s autopkgtest [16:43:00]: @@@@@@@@@@@@@@@@@@@@ apt-source intake
83s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (dsc) [2693 B]
83s Get:2 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (tar) [4447 kB]
83s Get:3 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (diff) [15.8 kB]
83s gpgv: Signature made Wed Aug 27 08:46:02 2025 UTC
83s gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A
83s gpgv: issuer "tchet@debian.org"
83s gpgv: Can't check signature: No public key
83s dpkg-source: warning: cannot verify inline signature for ./intake_0.6.6-4.dsc: no acceptable signature found
83s autopkgtest [16:43:03]: testing package intake version 0.6.6-4
83s autopkgtest [16:43:03]: build not needed
84s autopkgtest [16:43:04]: test run-unit-test: preparing testbed
84s Reading package lists...
85s Building dependency tree...
85s Reading state information...
85s Solving dependencies...
85s The following NEW packages will be installed:
85s   fonts-font-awesome fonts-glyphicons-halflings fonts-lato libblas3
85s   libgfortran5 libjs-bootstrap libjs-jquery libjs-sphinxdoc libjs-underscore
85s   liblapack3 node-html5shiv python3-aiohappyeyeballs python3-aiohttp
85s   python3-aiosignal python3-all python3-async-timeout python3-click
85s   python3-cloudpickle python3-dask python3-entrypoints python3-frozenlist
85s   python3-fsspec python3-iniconfig python3-intake python3-intake-doc
85s   python3-locket python3-msgpack python3-msgpack-numpy python3-multidict
85s   python3-numpy python3-numpy-dev python3-pandas python3-pandas-lib
85s   python3-partd python3-platformdirs python3-pluggy python3-propcache
85s   python3-pytest python3-pytz python3-toolz python3-tornado python3-yarl
85s   sphinx-rtd-theme-common
85s 0 upgraded, 43 newly installed, 0 to remove and 0 not upgraded.
85s Need to get 29.5 MB of archives.
85s After this operation, 145 MB of additional disk space will be used.
85s Get:1 http://ftpmaster.internal/ubuntu resolute/main s390x fonts-lato all 2.015-1 [2781 kB]
86s Get:2 http://ftpmaster.internal/ubuntu resolute/main s390x python3-numpy-dev s390x 1:2.2.4+ds-1ubuntu1 [147 kB]
86s Get:3 http://ftpmaster.internal/ubuntu resolute/main s390x libblas3 s390x 3.12.1-7 [254 kB]
86s Get:4 http://ftpmaster.internal/ubuntu resolute/main s390x libgfortran5 s390x 15.2.0-7ubuntu1 [629 kB]
86s Get:5 http://ftpmaster.internal/ubuntu resolute/main s390x liblapack3 s390x 3.12.1-7 [2983 kB]
86s Get:6 http://ftpmaster.internal/ubuntu resolute/main s390x python3-numpy s390x 1:2.2.4+ds-1ubuntu1 [4399 kB]
86s Get:7 http://ftpmaster.internal/ubuntu resolute/main s390x fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB]
86s Get:8 http://ftpmaster.internal/ubuntu resolute/universe s390x fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-6 [119 kB]
86s Get:9 http://ftpmaster.internal/ubuntu resolute/universe s390x libjs-bootstrap all 3.4.1+dfsg-6 [129 kB]
86s Get:10 http://ftpmaster.internal/ubuntu resolute/main s390x libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB]
86s Get:11 http://ftpmaster.internal/ubuntu resolute/main s390x libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB]
86s Get:12 http://ftpmaster.internal/ubuntu resolute/main s390x libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB]
86s Get:13 http://ftpmaster.internal/ubuntu resolute/universe s390x node-html5shiv all 3.7.3+dfsg-5 [13.5 kB]
86s Get:14 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-aiohappyeyeballs all 2.6.1-2 [11.1 kB]
86s Get:15 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-multidict s390x 6.4.3-1build1 [73.5 kB]
86s Get:16 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-propcache s390x 0.3.1-1build1 [56.4 kB]
86s Get:17 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-yarl s390x 1.22.0-1 [105 kB]
86s Get:18 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-async-timeout all 5.0.1-1 [6830 B]
86s Get:19 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-frozenlist s390x 1.8.0-1 [58.2 kB]
86s Get:20 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-aiosignal all 1.4.0-1 [5628 B]
87s Get:21 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-aiohttp s390x 3.11.16-1 [369 kB]
87s Get:22 http://ftpmaster.internal/ubuntu resolute/main s390x python3-all s390x 3.13.7-1 [886 B]
87s Get:23 http://ftpmaster.internal/ubuntu resolute/main s390x python3-click all 8.2.0+0.really.8.1.8-1 [80.0 kB]
87s Get:24 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-cloudpickle all 3.1.1-1 [22.4 kB]
87s Get:25 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-fsspec all 2025.3.2-1ubuntu1 [217 kB]
87s Get:26 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-toolz all 1.0.0-2 [45.0 kB]
87s Get:27 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-locket all 1.0.0-2 [5872 B]
87s Get:28 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-partd all 1.4.2-1 [15.7 kB]
87s Get:29 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-dask all 2024.12.1+dfsg-2 [875 kB]
87s Get:30 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-entrypoints all 0.4-3 [7174 B]
87s Get:31 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-iniconfig all 2.1.0-1 [6840 B]
87s Get:32 http://ftpmaster.internal/ubuntu resolute/main s390x python3-msgpack s390x 1.0.3-3build5 [119 kB]
87s Get:33 http://ftpmaster.internal/ubuntu resolute/main s390x python3-platformdirs all 4.3.7-1 [16.9 kB]
87s Get:34 http://ftpmaster.internal/ubuntu resolute-proposed/universe s390x python3-intake s390x 0.6.6-4 [197 kB]
87s Get:35 http://ftpmaster.internal/ubuntu resolute/main s390x sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB]
87s Get:36 http://ftpmaster.internal/ubuntu resolute-proposed/universe s390x python3-intake-doc all 0.6.6-4 [1549 kB]
87s Get:37 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-msgpack-numpy all 0.4.8-1 [7388 B]
87s Get:38 http://ftpmaster.internal/ubuntu resolute/main s390x python3-pytz all 2025.2-4 [32.3 kB]
87s Get:39 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-pandas-lib s390x 2.3.3+dfsg-1ubuntu1 [8668 kB]
87s Get:40 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-pandas all 2.3.3+dfsg-1ubuntu1 [2948 kB]
87s Get:41 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-pluggy all 1.6.0-1 [21.0 kB]
87s Get:42 http://ftpmaster.internal/ubuntu resolute/universe s390x python3-pytest all 8.3.5-2 [252 kB]
88s Get:43 http://ftpmaster.internal/ubuntu resolute/main s390x python3-tornado s390x 6.5.2-3 [304 kB]
88s Fetched 29.5 MB in 3s (10.9 MB/s)
88s Selecting previously unselected package fonts-lato.
88s (Reading database ... 61309 files and directories currently installed.)
88s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ...
88s Unpacking fonts-lato (2.015-1) ...
88s Selecting previously unselected package python3-numpy-dev:s390x.
88s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_s390x.deb ...
88s Unpacking python3-numpy-dev:s390x (1:2.2.4+ds-1ubuntu1) ...
88s Selecting previously unselected package libblas3:s390x.
88s Preparing to unpack .../02-libblas3_3.12.1-7_s390x.deb ...
88s Unpacking libblas3:s390x (3.12.1-7) ...
88s Selecting previously unselected package libgfortran5:s390x.
88s Preparing to unpack .../03-libgfortran5_15.2.0-7ubuntu1_s390x.deb ...
88s Unpacking libgfortran5:s390x (15.2.0-7ubuntu1) ...
88s Selecting previously unselected package liblapack3:s390x.
88s Preparing to unpack .../04-liblapack3_3.12.1-7_s390x.deb ...
88s Unpacking liblapack3:s390x (3.12.1-7) ...
88s Selecting previously unselected package python3-numpy.
88s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_s390x.deb ...
88s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ...
88s Selecting previously unselected package fonts-font-awesome.
88s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ...
88s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
88s Selecting previously unselected package fonts-glyphicons-halflings.
88s Preparing to unpack .../07-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-6_all.deb ...
88s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ...
88s Selecting previously unselected package libjs-bootstrap.
88s Preparing to unpack .../08-libjs-bootstrap_3.4.1+dfsg-6_all.deb ...
88s Unpacking libjs-bootstrap (3.4.1+dfsg-6) ...
88s Selecting previously unselected package libjs-jquery.
88s Preparing to unpack .../09-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ...
88s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
89s Selecting previously unselected package libjs-underscore.
89s Preparing to unpack .../10-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ...
89s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
89s Selecting previously unselected package libjs-sphinxdoc.
89s Preparing to unpack .../11-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ...
89s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ...
89s Selecting previously unselected package node-html5shiv.
89s Preparing to unpack .../12-node-html5shiv_3.7.3+dfsg-5_all.deb ...
89s Unpacking node-html5shiv (3.7.3+dfsg-5) ...
89s Selecting previously unselected package python3-aiohappyeyeballs.
89s Preparing to unpack .../13-python3-aiohappyeyeballs_2.6.1-2_all.deb ...
89s Unpacking python3-aiohappyeyeballs (2.6.1-2) ...
89s Selecting previously unselected package python3-multidict.
89s Preparing to unpack .../14-python3-multidict_6.4.3-1build1_s390x.deb ...
89s Unpacking python3-multidict (6.4.3-1build1) ...
89s Selecting previously unselected package python3-propcache.
89s Preparing to unpack .../15-python3-propcache_0.3.1-1build1_s390x.deb ...
89s Unpacking python3-propcache (0.3.1-1build1) ...
89s Selecting previously unselected package python3-yarl.
89s Preparing to unpack .../16-python3-yarl_1.22.0-1_s390x.deb ...
89s Unpacking python3-yarl (1.22.0-1) ...
89s Selecting previously unselected package python3-async-timeout.
89s Preparing to unpack .../17-python3-async-timeout_5.0.1-1_all.deb ...
89s Unpacking python3-async-timeout (5.0.1-1) ...
89s Selecting previously unselected package python3-frozenlist.
89s Preparing to unpack .../18-python3-frozenlist_1.8.0-1_s390x.deb ...
89s Unpacking python3-frozenlist (1.8.0-1) ...
89s Selecting previously unselected package python3-aiosignal.
89s Preparing to unpack .../19-python3-aiosignal_1.4.0-1_all.deb ...
89s Unpacking python3-aiosignal (1.4.0-1) ...
89s Selecting previously unselected package python3-aiohttp.
89s Preparing to unpack .../20-python3-aiohttp_3.11.16-1_s390x.deb ...
89s Unpacking python3-aiohttp (3.11.16-1) ...
89s Selecting previously unselected package python3-all.
89s Preparing to unpack .../21-python3-all_3.13.7-1_s390x.deb ...
89s Unpacking python3-all (3.13.7-1) ...
89s Selecting previously unselected package python3-click.
89s Preparing to unpack .../22-python3-click_8.2.0+0.really.8.1.8-1_all.deb ...
89s Unpacking python3-click (8.2.0+0.really.8.1.8-1) ...
89s Selecting previously unselected package python3-cloudpickle.
89s Preparing to unpack .../23-python3-cloudpickle_3.1.1-1_all.deb ...
89s Unpacking python3-cloudpickle (3.1.1-1) ...
89s Selecting previously unselected package python3-fsspec.
89s Preparing to unpack .../24-python3-fsspec_2025.3.2-1ubuntu1_all.deb ...
89s Unpacking python3-fsspec (2025.3.2-1ubuntu1) ...
89s Selecting previously unselected package python3-toolz.
89s Preparing to unpack .../25-python3-toolz_1.0.0-2_all.deb ...
89s Unpacking python3-toolz (1.0.0-2) ...
89s Selecting previously unselected package python3-locket.
89s Preparing to unpack .../26-python3-locket_1.0.0-2_all.deb ...
89s Unpacking python3-locket (1.0.0-2) ...
89s Selecting previously unselected package python3-partd.
89s Preparing to unpack .../27-python3-partd_1.4.2-1_all.deb ...
89s Unpacking python3-partd (1.4.2-1) ...
89s Selecting previously unselected package python3-dask.
89s Preparing to unpack .../28-python3-dask_2024.12.1+dfsg-2_all.deb ...
89s Unpacking python3-dask (2024.12.1+dfsg-2) ...
89s Selecting previously unselected package python3-entrypoints.
89s Preparing to unpack .../29-python3-entrypoints_0.4-3_all.deb ...
89s Unpacking python3-entrypoints (0.4-3) ...
89s Selecting previously unselected package python3-iniconfig.
89s Preparing to unpack .../30-python3-iniconfig_2.1.0-1_all.deb ...
89s Unpacking python3-iniconfig (2.1.0-1) ...
89s Selecting previously unselected package python3-msgpack.
89s Preparing to unpack .../31-python3-msgpack_1.0.3-3build5_s390x.deb ...
89s Unpacking python3-msgpack (1.0.3-3build5) ...
89s Selecting previously unselected package python3-platformdirs.
89s Preparing to unpack .../32-python3-platformdirs_4.3.7-1_all.deb ...
89s Unpacking python3-platformdirs (4.3.7-1) ...
89s Selecting previously unselected package python3-intake.
89s Preparing to unpack .../33-python3-intake_0.6.6-4_s390x.deb ...
89s Unpacking python3-intake (0.6.6-4) ...
89s Selecting previously unselected package sphinx-rtd-theme-common.
89s Preparing to unpack .../34-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ...
89s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
89s Selecting previously unselected package python3-intake-doc.
89s Preparing to unpack .../35-python3-intake-doc_0.6.6-4_all.deb ...
89s Unpacking python3-intake-doc (0.6.6-4) ...
89s Selecting previously unselected package python3-msgpack-numpy.
89s Preparing to unpack .../36-python3-msgpack-numpy_0.4.8-1_all.deb ...
89s Unpacking python3-msgpack-numpy (0.4.8-1) ...
89s Selecting previously unselected package python3-pytz.
89s Preparing to unpack .../37-python3-pytz_2025.2-4_all.deb ...
89s Unpacking python3-pytz (2025.2-4) ...
89s Selecting previously unselected package python3-pandas-lib:s390x.
89s Preparing to unpack .../38-python3-pandas-lib_2.3.3+dfsg-1ubuntu1_s390x.deb ...
89s Unpacking python3-pandas-lib:s390x (2.3.3+dfsg-1ubuntu1) ...
89s Selecting previously unselected package python3-pandas.
89s Preparing to unpack .../39-python3-pandas_2.3.3+dfsg-1ubuntu1_all.deb ...
89s Unpacking python3-pandas (2.3.3+dfsg-1ubuntu1) ...
89s Selecting previously unselected package python3-pluggy.
89s Preparing to unpack .../40-python3-pluggy_1.6.0-1_all.deb ...
89s Unpacking python3-pluggy (1.6.0-1) ...
89s Selecting previously unselected package python3-pytest.
89s Preparing to unpack .../41-python3-pytest_8.3.5-2_all.deb ...
89s Unpacking python3-pytest (8.3.5-2) ...
89s Selecting previously unselected package python3-tornado.
89s Preparing to unpack .../42-python3-tornado_6.5.2-3_s390x.deb ...
89s Unpacking python3-tornado (6.5.2-3) ...
89s Setting up python3-entrypoints (0.4-3) ...
89s Setting up python3-iniconfig (2.1.0-1) ...
89s Setting up python3-tornado (6.5.2-3) ...
90s Setting up fonts-lato (2.015-1) ...
90s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ...
90s Setting up python3-fsspec (2025.3.2-1ubuntu1) ...
90s Setting up node-html5shiv (3.7.3+dfsg-5) ...
90s Setting up python3-all (3.13.7-1) ...
90s Setting up python3-pytz (2025.2-4) ...
90s Setting up python3-click (8.2.0+0.really.8.1.8-1) ...
90s Setting up python3-platformdirs (4.3.7-1) ...
90s Setting up python3-multidict (6.4.3-1build1) ...
90s Setting up python3-cloudpickle (3.1.1-1) ...
90s Setting up python3-frozenlist (1.8.0-1) ...
90s Setting up python3-aiosignal (1.4.0-1) ...
91s Setting up python3-async-timeout (5.0.1-1) ...
91s Setting up libblas3:s390x (3.12.1-7) ...
91s update-alternatives: using /usr/lib/s390x-linux-gnu/blas/libblas.so.3 to provide /usr/lib/s390x-linux-gnu/libblas.so.3 (libblas.so.3-s390x-linux-gnu) in auto mode
91s Setting up python3-numpy-dev:s390x (1:2.2.4+ds-1ubuntu1) ...
91s Setting up python3-aiohappyeyeballs (2.6.1-2) ...
91s Setting up libgfortran5:s390x (15.2.0-7ubuntu1) ...
91s Setting up python3-pluggy (1.6.0-1) ...
91s Setting up python3-propcache (0.3.1-1build1) ...
91s Setting up python3-toolz (1.0.0-2) ...
91s Setting up python3-msgpack (1.0.3-3build5) ...
91s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
91s Setting up python3-locket (1.0.0-2) ...
91s Setting up python3-yarl (1.22.0-1) ... 91s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 91s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 91s Setting up libjs-bootstrap (3.4.1+dfsg-6) ... 91s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 91s Setting up python3-partd (1.4.2-1) ... 91s Setting up liblapack3:s390x (3.12.1-7) ... 91s update-alternatives: using /usr/lib/s390x-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/s390x-linux-gnu/liblapack.so.3 (liblapack.so.3-s390x-linux-gnu) in auto mode 91s Setting up python3-pytest (8.3.5-2) ... 92s Setting up python3-aiohttp (3.11.16-1) ... 92s Setting up python3-dask (2024.12.1+dfsg-2) ... 93s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ... 95s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ... 95s Setting up python3-intake (0.6.6-4) ... 95s Setting up python3-msgpack-numpy (0.4.8-1) ... 95s Setting up python3-pandas-lib:s390x (2.3.3+dfsg-1ubuntu1) ... 95s Setting up python3-intake-doc (0.6.6-4) ... 95s Setting up python3-pandas (2.3.3+dfsg-1ubuntu1) ... 99s Processing triggers for man-db (2.13.1-1) ... 100s Processing triggers for libc-bin (2.42-2ubuntu2) ... 101s autopkgtest [16:43:21]: test run-unit-test: [----------------------- 102s ============================= test session starts ============================== 102s platform linux -- Python 3.13.9, pytest-8.3.5, pluggy-1.6.0 -- /usr/bin/python3.13 102s cachedir: .pytest_cache 102s rootdir: /tmp/autopkgtest.SS0vJB/build.nia/src 102s plugins: typeguard-4.4.2 103s collecting ... 
collected 424 items / 11 skipped 103s 103s intake/auth/tests/test_auth.py::test_get PASSED [ 0%] 103s intake/auth/tests/test_auth.py::test_base PASSED [ 0%] 103s intake/auth/tests/test_auth.py::test_base_client PASSED [ 0%] 103s intake/auth/tests/test_auth.py::test_base_get_case_insensitive PASSED [ 0%] 103s intake/auth/tests/test_auth.py::test_secret PASSED [ 1%] 103s intake/auth/tests/test_auth.py::test_secret_client PASSED [ 1%] 103s intake/catalog/tests/test_alias.py::test_simple PASSED [ 1%] 103s intake/catalog/tests/test_alias.py::test_mapping PASSED [ 1%] 106s intake/catalog/tests/test_auth_integration.py::test_secret_auth PASSED [ 2%] 109s intake/catalog/tests/test_auth_integration.py::test_secret_auth_fail PASSED [ 2%] 109s intake/catalog/tests/test_caching_integration.py::test_load_csv PASSED [ 2%] 109s intake/catalog/tests/test_caching_integration.py::test_list_of_files PASSED [ 2%] 109s intake/catalog/tests/test_caching_integration.py::test_bad_type_cache PASSED [ 3%] 109s intake/catalog/tests/test_caching_integration.py::test_load_textfile FAILED [ 3%] 109s intake/catalog/tests/test_caching_integration.py::test_load_arr PASSED [ 3%] 109s intake/catalog/tests/test_caching_integration.py::test_regex[test_no_regex] PASSED [ 3%] 109s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_no_match] PASSED [ 4%] 109s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_partial_match] PASSED [ 4%] 109s intake/catalog/tests/test_caching_integration.py::test_get_metadata PASSED [ 4%] 109s intake/catalog/tests/test_caching_integration.py::test_clear_cache PASSED [ 4%] 109s intake/catalog/tests/test_caching_integration.py::test_clear_cache_bad_metadata PASSED [ 4%] 109s intake/catalog/tests/test_caching_integration.py::test_clear_all PASSED [ 5%] 109s intake/catalog/tests/test_caching_integration.py::test_second_load PASSED [ 5%] 109s intake/catalog/tests/test_caching_integration.py::test_second_load_timestamp PASSED [ 5%] 109s 
intake/catalog/tests/test_caching_integration.py::test_second_load_refresh PASSED [ 5%] 110s intake/catalog/tests/test_caching_integration.py::test_multiple_cache PASSED [ 6%] 110s intake/catalog/tests/test_caching_integration.py::test_disable_caching PASSED [ 6%] 110s intake/catalog/tests/test_caching_integration.py::test_ds_set_cache_dir PASSED [ 6%] 110s intake/catalog/tests/test_catalog_save.py::test_catalog_description PASSED [ 6%] 110s intake/catalog/tests/test_core.py::test_no_entry PASSED [ 7%] 110s intake/catalog/tests/test_core.py::test_regression PASSED [ 7%] 110s intake/catalog/tests/test_default.py::test_load PASSED [ 7%] 110s intake/catalog/tests/test_discovery.py::test_catalog_discovery PASSED [ 7%] 110s intake/catalog/tests/test_discovery.py::test_deferred_import PASSED [ 8%] 110s intake/catalog/tests/test_gui.py::test_cat_no_panel_does_not_raise_errors PASSED [ 8%] 110s intake/catalog/tests/test_gui.py::test_cat_no_panel_display_gui PASSED [ 8%] 110s intake/catalog/tests/test_gui.py::test_cat_gui SKIPPED (could not im...) [ 8%] 110s intake/catalog/tests/test_gui.py::test_entry_no_panel_does_not_raise_errors PASSED [ 8%] 110s intake/catalog/tests/test_gui.py::test_entry_no_panel_display_gui PASSED [ 9%] 110s intake/catalog/tests/test_gui.py::test_entry_gui SKIPPED (could not ...) 
[ 9%] 110s intake/catalog/tests/test_local.py::test_local_catalog PASSED [ 9%] 110s intake/catalog/tests/test_local.py::test_get_items PASSED [ 9%] 110s intake/catalog/tests/test_local.py::test_nested FAILED [ 10%] 110s intake/catalog/tests/test_local.py::test_nested_gets_name_from_super PASSED [ 10%] 110s intake/catalog/tests/test_local.py::test_hash PASSED [ 10%] 110s intake/catalog/tests/test_local.py::test_getitem PASSED [ 10%] 110s intake/catalog/tests/test_local.py::test_source_plugin_config PASSED [ 11%] 110s intake/catalog/tests/test_local.py::test_metadata PASSED [ 11%] 110s intake/catalog/tests/test_local.py::test_use_source_plugin_from_config PASSED [ 11%] 110s intake/catalog/tests/test_local.py::test_get_dir PASSED [ 11%] 110s intake/catalog/tests/test_local.py::test_entry_dir_function PASSED [ 12%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[bool-False] PASSED [ 12%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[datetime-expected1] PASSED [ 12%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[float-0.0] PASSED [ 12%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[int-0] PASSED [ 12%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[list-expected4] PASSED [ 13%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[str-] PASSED [ 13%] 110s intake/catalog/tests/test_local.py::test_user_parameter_default_value[unicode-] PASSED [ 13%] 110s intake/catalog/tests/test_local.py::test_user_parameter_repr PASSED [ 13%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-true-True] PASSED [ 14%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-0-False] PASSED [ 14%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-given2-expected2] PASSED [ 14%] 110s 
intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-2018-01-01 12:34AM-expected3] PASSED [ 14%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-1234567890000000000-expected4] PASSED [ 15%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[float-3.14-3.14] PASSED [ 15%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[int-1-1] PASSED [ 15%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[list-given7-expected7] PASSED [ 15%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[str-1-1] PASSED [ 16%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[unicode-foo-foo] PASSED [ 16%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[now] PASSED [ 16%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[today] PASSED [ 16%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[float-100.0-100.0] PASSED [ 16%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20-20] PASSED [ 17%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20.0-20] PASSED [ 17%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[float-100.0-100.0] PASSED [ 17%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20-20] PASSED [ 17%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20.0-20] PASSED [ 18%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[float-given0-expected0] PASSED [ 18%] 110s intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[int-given1-expected1] PASSED [ 18%] 110s intake/catalog/tests/test_local.py::test_user_parameter_validation_range PASSED [ 18%] 110s intake/catalog/tests/test_local.py::test_user_parameter_validation_allowed PASSED [ 19%] 110s 
intake/catalog/tests/test_local.py::test_user_pars_list PASSED [ 19%]
110s intake/catalog/tests/test_local.py::test_user_pars_mlist PASSED [ 19%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[catalog_non_dict] PASSED [ 19%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_missing] PASSED [ 20%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_name_non_string] PASSED [ 20%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_non_dict] PASSED [ 20%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_value_non_dict] PASSED [ 20%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_missing_required] PASSED [ 20%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_name_non_string] PASSED [ 21%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_non_dict] PASSED [ 21%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_choice] PASSED [ 21%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_type] PASSED [ 21%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_non_dict] PASSED [ 22%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_non_dict] PASSED [ 22%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing] PASSED [ 22%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing_key] PASSED [ 22%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_dict] PASSED [ 23%]
110s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_list] PASSED [ 23%]
110s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_data_source_list] PASSED [ 23%]
110s
intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_params_list] PASSED [ 23%]
110s intake/catalog/tests/test_local.py::test_union_catalog PASSED [ 24%]
110s intake/catalog/tests/test_local.py::test_persist_local_cat PASSED [ 24%]
110s intake/catalog/tests/test_local.py::test_empty_catalog PASSED [ 24%]
110s intake/catalog/tests/test_local.py::test_nonexistent_error PASSED [ 24%]
110s intake/catalog/tests/test_local.py::test_duplicate_data_sources PASSED [ 25%]
110s intake/catalog/tests/test_local.py::test_duplicate_parameters PASSED [ 25%]
110s intake/catalog/tests/test_local.py::test_catalog_file_removal PASSED [ 25%]
110s intake/catalog/tests/test_local.py::test_flatten_duplicate_error PASSED [ 25%]
110s intake/catalog/tests/test_local.py::test_multi_cat_names PASSED [ 25%]
110s intake/catalog/tests/test_local.py::test_name_of_builtin PASSED [ 26%]
110s intake/catalog/tests/test_local.py::test_cat_with_declared_name PASSED [ 26%]
110s intake/catalog/tests/test_local.py::test_cat_with_no_declared_name_gets_name_from_dir_if_file_named_catalog PASSED [ 26%]
110s intake/catalog/tests/test_local.py::test_default_expansions PASSED [ 26%]
111s intake/catalog/tests/test_local.py::test_remote_cat PASSED [ 27%]
111s intake/catalog/tests/test_local.py::test_multi_plugins PASSED [ 27%]
111s intake/catalog/tests/test_local.py::test_no_plugins PASSED [ 27%]
111s intake/catalog/tests/test_local.py::test_explicit_entry_driver PASSED [ 27%]
111s intake/catalog/tests/test_local.py::test_getitem_and_getattr PASSED [ 28%]
111s intake/catalog/tests/test_local.py::test_dot_names PASSED [ 28%]
111s intake/catalog/tests/test_local.py::test_listing PASSED [ 28%]
111s intake/catalog/tests/test_local.py::test_dict_save PASSED [ 28%]
111s intake/catalog/tests/test_local.py::test_dict_save_complex PASSED [ 29%]
111s intake/catalog/tests/test_local.py::test_dict_adddel PASSED [ 29%]
111s intake/catalog/tests/test_local.py::test_filter PASSED [ 29%]
111s
intake/catalog/tests/test_local.py::test_from_dict_with_data_source PASSED [ 29%]
111s intake/catalog/tests/test_local.py::test_no_instance PASSED [ 29%]
111s intake/catalog/tests/test_local.py::test_fsspec_integration PASSED [ 30%]
111s intake/catalog/tests/test_local.py::test_cat_add PASSED [ 30%]
111s intake/catalog/tests/test_local.py::test_no_entries_items PASSED [ 30%]
111s intake/catalog/tests/test_local.py::test_cat_dictlike PASSED [ 30%]
111s intake/catalog/tests/test_local.py::test_inherit_params SKIPPED (tes...) [ 31%]
111s intake/catalog/tests/test_local.py::test_runtime_overwrite_params SKIPPED [ 31%]
111s intake/catalog/tests/test_local.py::test_local_param_overwrites SKIPPED [ 31%]
111s intake/catalog/tests/test_local.py::test_local_and_global_params SKIPPED [ 31%]
111s intake/catalog/tests/test_local.py::test_search_inherit_params SKIPPED [ 32%]
111s intake/catalog/tests/test_local.py::test_multiple_cats_params SKIPPED [ 32%]
111s intake/catalog/tests/test_parameters.py::test_simplest PASSED [ 32%]
111s intake/catalog/tests/test_parameters.py::test_cache_default_source PASSED [ 32%]
111s intake/catalog/tests/test_parameters.py::test_parameter_default PASSED [ 33%]
111s intake/catalog/tests/test_parameters.py::test_maybe_default_from_env PASSED [ 33%]
111s intake/catalog/tests/test_parameters.py::test_up_override_and_render PASSED [ 33%]
111s intake/catalog/tests/test_parameters.py::test_user_explicit_override PASSED [ 33%]
111s intake/catalog/tests/test_parameters.py::test_auto_env_expansion PASSED [ 33%]
111s intake/catalog/tests/test_parameters.py::test_validate_up PASSED [ 34%]
111s intake/catalog/tests/test_parameters.py::test_validate_par PASSED [ 34%]
111s intake/catalog/tests/test_parameters.py::test_mlist_parameter PASSED [ 34%]
111s intake/catalog/tests/test_parameters.py::test_explicit_overrides PASSED [ 34%]
111s intake/catalog/tests/test_parameters.py::test_extra_arg PASSED [ 35%]
111s
intake/catalog/tests/test_parameters.py::test_unknown PASSED [ 35%]
111s intake/catalog/tests/test_parameters.py::test_catalog_passthrough PASSED [ 35%]
111s intake/catalog/tests/test_persist.py::test_idempotent SKIPPED (could...) [ 35%]
111s intake/catalog/tests/test_persist.py::test_parquet SKIPPED (could no...) [ 36%]
113s intake/catalog/tests/test_reload_integration.py::test_reload_updated_config PASSED [ 36%]
115s intake/catalog/tests/test_reload_integration.py::test_reload_updated_directory PASSED [ 36%]
118s intake/catalog/tests/test_reload_integration.py::test_reload_missing_remote_directory PASSED [ 36%]
120s intake/catalog/tests/test_reload_integration.py::test_reload_missing_local_directory PASSED [ 37%]
121s intake/catalog/tests/test_remote_integration.py::test_info_describe FAILED [ 37%]
121s intake/catalog/tests/test_remote_integration.py::test_bad_url PASSED [ 37%]
121s intake/catalog/tests/test_remote_integration.py::test_metadata PASSED [ 37%]
121s intake/catalog/tests/test_remote_integration.py::test_nested_remote PASSED [ 37%]
121s intake/catalog/tests/test_remote_integration.py::test_remote_direct FAILED [ 38%]
121s intake/catalog/tests/test_remote_integration.py::test_entry_metadata PASSED [ 38%]
121s intake/catalog/tests/test_remote_integration.py::test_unknown_source PASSED [ 38%]
121s intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface FAILED [ 38%]
121s intake/catalog/tests/test_remote_integration.py::test_environment_evaluation PASSED [ 39%]
121s intake/catalog/tests/test_remote_integration.py::test_read FAILED [ 39%]
121s intake/catalog/tests/test_remote_integration.py::test_read_direct PASSED [ 39%]
121s intake/catalog/tests/test_remote_integration.py::test_read_chunks FAILED [ 39%]
121s intake/catalog/tests/test_remote_integration.py::test_read_partition FAILED [ 40%]
121s intake/catalog/tests/test_remote_integration.py::test_close FAILED [ 40%]
121s
intake/catalog/tests/test_remote_integration.py::test_with FAILED [ 40%]
121s intake/catalog/tests/test_remote_integration.py::test_pickle FAILED [ 40%]
121s intake/catalog/tests/test_remote_integration.py::test_to_dask FAILED [ 41%]
121s intake/catalog/tests/test_remote_integration.py::test_remote_env PASSED [ 41%]
121s intake/catalog/tests/test_remote_integration.py::test_remote_sequence FAILED [ 41%]
121s intake/catalog/tests/test_remote_integration.py::test_remote_arr PASSED [ 41%]
121s intake/catalog/tests/test_remote_integration.py::test_pagination PASSED [ 41%]
121s intake/catalog/tests/test_remote_integration.py::test_dir FAILED [ 42%]
121s intake/catalog/tests/test_remote_integration.py::test_getitem_and_getattr PASSED [ 42%]
121s intake/catalog/tests/test_remote_integration.py::test_search PASSED [ 42%]
121s intake/catalog/tests/test_remote_integration.py::test_access_subcatalog PASSED [ 42%]
121s intake/catalog/tests/test_remote_integration.py::test_len PASSED [ 43%]
123s intake/catalog/tests/test_remote_integration.py::test_datetime PASSED [ 43%]
123s intake/catalog/tests/test_utils.py::test_expand_templates PASSED [ 43%]
123s intake/catalog/tests/test_utils.py::test_expand_nested_template PASSED [ 43%]
123s intake/catalog/tests/test_utils.py::test_coerce_datetime[None-expected0] PASSED [ 44%]
123s intake/catalog/tests/test_utils.py::test_coerce_datetime[1-expected1] PASSED [ 44%]
123s intake/catalog/tests/test_utils.py::test_coerce_datetime[1988-02-24T13:37+0100-expected2] PASSED [ 44%]
123s intake/catalog/tests/test_utils.py::test_coerce_datetime[test_input3-expected3] PASSED [ 44%]
123s intake/catalog/tests/test_utils.py::test_flatten PASSED [ 45%]
123s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_0] PASSED [ 45%]
123s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_1] PASSED [ 45%]
123s intake/catalog/tests/test_utils.py::test_coerce[1-str-1] PASSED [ 45%]
123s intake/catalog/tests/test_utils.py::test_coerce[value3-list-expected3]
PASSED [ 45%]
123s intake/catalog/tests/test_utils.py::test_coerce[value4-list-expected4] PASSED [ 46%]
123s intake/catalog/tests/test_utils.py::test_coerce[value5-list[str]-expected5] PASSED [ 46%]
123s intake/cli/client/tests/test_cache.py::test_help PASSED [ 46%]
123s intake/cli/client/tests/test_cache.py::test_list_keys PASSED [ 46%]
124s intake/cli/client/tests/test_cache.py::test_precache PASSED [ 47%]
124s intake/cli/client/tests/test_cache.py::test_clear_all PASSED [ 47%]
125s intake/cli/client/tests/test_cache.py::test_clear_one PASSED [ 47%]
125s intake/cli/client/tests/test_cache.py::test_usage PASSED [ 47%]
125s intake/cli/client/tests/test_conf.py::test_reset PASSED [ 48%]
125s intake/cli/client/tests/test_conf.py::test_info PASSED [ 48%]
126s intake/cli/client/tests/test_conf.py::test_defaults PASSED [ 48%]
126s intake/cli/client/tests/test_conf.py::test_get PASSED [ 48%]
126s intake/cli/client/tests/test_conf.py::test_log_level PASSED [ 49%]
126s intake/cli/client/tests/test_local_integration.py::test_list PASSED [ 49%]
127s intake/cli/client/tests/test_local_integration.py::test_full_list PASSED [ 49%]
127s intake/cli/client/tests/test_local_integration.py::test_describe PASSED [ 49%]
127s intake/cli/client/tests/test_local_integration.py::test_exists_pass PASSED [ 50%]
128s intake/cli/client/tests/test_local_integration.py::test_exists_fail PASSED [ 50%]
129s intake/cli/client/tests/test_local_integration.py::test_discover FAILED [ 50%]
129s intake/cli/client/tests/test_local_integration.py::test_get_pass FAILED [ 50%]
130s intake/cli/client/tests/test_local_integration.py::test_get_fail PASSED [ 50%]
130s intake/cli/client/tests/test_local_integration.py::test_example PASSED [ 51%]
130s intake/cli/server/tests/test_serializer.py::test_dataframe[ser0] SKIPPED [ 51%]
130s intake/cli/server/tests/test_serializer.py::test_dataframe[ser1] SKIPPED [ 51%]
130s intake/cli/server/tests/test_serializer.py::test_dataframe[ser2] SKIPPED [ 51%]
130s
intake/cli/server/tests/test_serializer.py::test_ndarray[ser0] PASSED [ 52%]
130s intake/cli/server/tests/test_serializer.py::test_ndarray[ser1] PASSED [ 52%]
130s intake/cli/server/tests/test_serializer.py::test_ndarray[ser2] PASSED [ 52%]
130s intake/cli/server/tests/test_serializer.py::test_python[ser0] PASSED [ 52%]
130s intake/cli/server/tests/test_serializer.py::test_python[ser1] PASSED [ 53%]
130s intake/cli/server/tests/test_serializer.py::test_python[ser2] PASSED [ 53%]
130s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp0] PASSED [ 53%]
130s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp1] PASSED [ 53%]
130s intake/cli/server/tests/test_serializer.py::test_none_compress PASSED [ 54%]
130s intake/cli/server/tests/test_server.py::TestServerV1Info::test_info PASSED [ 54%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_bad_action PASSED [ 54%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer FAILED [ 54%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format FAILED [ 54%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open FAILED [ 55%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open_direct PASSED [ 55%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_part_compressed SKIPPED [ 55%]
130s intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_partition SKIPPED [ 55%]
131s intake/cli/server/tests/test_server.py::test_flatten_flag PASSED [ 56%]
131s intake/cli/server/tests/test_server.py::test_port_flag PASSED [ 56%]
131s intake/cli/tests/test_util.py::test_print_entry_info PASSED [ 56%]
131s intake/cli/tests/test_util.py::test_die PASSED [ 56%]
131s intake/cli/tests/test_util.py::Test_nice_join::test_default PASSED [ 57%]
131s intake/cli/tests/test_util.py::Test_nice_join::test_string_conjunction PASSED [ 57%]
131s
intake/cli/tests/test_util.py::Test_nice_join::test_None_conjunction PASSED [ 57%]
131s intake/cli/tests/test_util.py::Test_nice_join::test_sep PASSED [ 57%]
131s intake/cli/tests/test_util.py::TestSubcommand::test_initialize_abstract PASSED [ 58%]
131s intake/cli/tests/test_util.py::TestSubcommand::test_invoke_abstract PASSED [ 58%]
131s intake/container/tests/test_generics.py::test_generic_dataframe PASSED [ 58%]
132s intake/container/tests/test_persist.py::test_store PASSED [ 58%]
132s intake/container/tests/test_persist.py::test_backtrack PASSED [ 58%]
132s intake/container/tests/test_persist.py::test_persist_with_nonnumeric_ttl_raises_error PASSED [ 59%]
132s intake/container/tests/test_persist.py::test_undask_persist SKIPPED [ 59%]
132s intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors ERROR [ 59%]
132s intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui ERROR [ 59%]
132s intake/interface/tests/test_init_gui.py::test_display_init_gui ERROR [ 60%]
132s intake/source/tests/test_base.py::test_datasource_base_method_exceptions PASSED [ 60%]
132s intake/source/tests/test_base.py::test_name PASSED [ 60%]
132s intake/source/tests/test_base.py::test_datasource_base_context_manager PASSED [ 60%]
132s intake/source/tests/test_base.py::test_datasource_discover PASSED [ 61%]
132s intake/source/tests/test_base.py::test_datasource_read PASSED [ 61%]
132s intake/source/tests/test_base.py::test_datasource_read_chunked PASSED [ 61%]
132s intake/source/tests/test_base.py::test_datasource_read_partition PASSED [ 61%]
132s intake/source/tests/test_base.py::test_datasource_read_partition_out_of_range PASSED [ 62%]
132s intake/source/tests/test_base.py::test_datasource_to_dask PASSED [ 62%]
132s intake/source/tests/test_base.py::test_datasource_close PASSED [ 62%]
132s intake/source/tests/test_base.py::test_datasource_context_manager PASSED [ 62%]
132s intake/source/tests/test_base.py::test_datasource_pickle PASSED [ 62%]
132s
intake/source/tests/test_base.py::test_datasource_python_discover PASSED [ 63%]
132s intake/source/tests/test_base.py::test_datasource_python_read PASSED [ 63%]
132s intake/source/tests/test_base.py::test_datasource_python_to_dask PASSED [ 63%]
132s intake/source/tests/test_base.py::test_yaml_method PASSED [ 63%]
132s intake/source/tests/test_base.py::test_alias_fail PASSED [ 64%]
132s intake/source/tests/test_base.py::test_reconfigure PASSED [ 64%]
132s intake/source/tests/test_base.py::test_import_name[data0] PASSED [ 64%]
132s intake/source/tests/test_base.py::test_import_name[data1] PASSED [ 64%]
132s intake/source/tests/test_base.py::test_import_name[data2] PASSED [ 65%]
132s intake/source/tests/test_base.py::test_import_name[data3] PASSED [ 65%]
132s intake/source/tests/test_base.py::test_import_name[data4] PASSED [ 65%]
132s intake/source/tests/test_cache.py::test_ensure_cache_dir PASSED [ 65%]
132s intake/source/tests/test_cache.py::test_munge_path PASSED [ 66%]
132s intake/source/tests/test_cache.py::test_hash PASSED [ 66%]
132s intake/source/tests/test_cache.py::test_path PASSED [ 66%]
132s intake/source/tests/test_cache.py::test_dir_cache PASSED [ 66%]
132s intake/source/tests/test_cache.py::test_compressed_cache PASSED [ 66%]
132s intake/source/tests/test_cache.py::test_filtered_compressed_cache PASSED [ 67%]
133s intake/source/tests/test_cache.py::test_cache_to_cat PASSED [ 67%]
133s intake/source/tests/test_cache.py::test_compressed_cache_infer PASSED [ 67%]
133s intake/source/tests/test_cache.py::test_compressions[tgz] PASSED [ 67%]
133s intake/source/tests/test_cache.py::test_compressions[tbz] PASSED [ 68%]
133s intake/source/tests/test_cache.py::test_compressions[tar] PASSED [ 68%]
133s intake/source/tests/test_cache.py::test_compressions[gz] PASSED [ 68%]
133s intake/source/tests/test_cache.py::test_compressions[bz] PASSED [ 68%]
133s intake/source/tests/test_cache.py::test_compressed_cache_bad PASSED [ 69%]
133s
intake/source/tests/test_cache.py::test_dat SKIPPED (DAT not avaiable) [ 69%]
133s intake/source/tests/test_csv.py::test_csv_plugin PASSED [ 69%]
133s intake/source/tests/test_csv.py::test_open PASSED [ 69%]
133s intake/source/tests/test_csv.py::test_discover PASSED [ 70%]
133s intake/source/tests/test_csv.py::test_read PASSED [ 70%]
133s intake/source/tests/test_csv.py::test_read_list PASSED [ 70%]
133s intake/source/tests/test_csv.py::test_read_chunked PASSED [ 70%]
133s intake/source/tests/test_csv.py::test_read_pattern PASSED [ 70%]
133s intake/source/tests/test_csv.py::test_read_pattern_with_cache PASSED [ 71%]
133s intake/source/tests/test_csv.py::test_read_pattern_with_path_as_pattern_str PASSED [ 71%]
133s intake/source/tests/test_csv.py::test_read_partition PASSED [ 71%]
133s intake/source/tests/test_csv.py::test_to_dask PASSED [ 71%]
133s intake/source/tests/test_csv.py::test_plot SKIPPED (could not import...) [ 72%]
133s intake/source/tests/test_csv.py::test_close PASSED [ 72%]
133s intake/source/tests/test_csv.py::test_pickle PASSED [ 72%]
133s intake/source/tests/test_derived.py::test_columns PASSED [ 72%]
133s intake/source/tests/test_derived.py::test_df_transform PASSED [ 73%]
133s intake/source/tests/test_derived.py::test_barebones PASSED [ 73%]
133s intake/source/tests/test_derived.py::test_other_cat FAILED [ 73%]
133s intake/source/tests/test_discovery.py::test_package_scan PASSED [ 73%]
133s intake/source/tests/test_discovery.py::test_discover_cli PASSED [ 74%]
133s intake/source/tests/test_discovery.py::test_discover PASSED [ 74%]
133s intake/source/tests/test_discovery.py::test_enable_and_disable PASSED [ 74%]
133s intake/source/tests/test_discovery.py::test_discover_collision PASSED [ 74%]
133s intake/source/tests/test_json.py::test_jsonfile[None] PASSED [ 75%]
133s intake/source/tests/test_json.py::test_jsonfile[gzip] PASSED [ 75%]
133s intake/source/tests/test_json.py::test_jsonfile[bz2] PASSED [ 75%]
133s
intake/source/tests/test_json.py::test_jsonfile_none[None] PASSED [ 75%]
133s intake/source/tests/test_json.py::test_jsonfile_none[gzip] PASSED [ 75%]
133s intake/source/tests/test_json.py::test_jsonfile_none[bz2] PASSED [ 76%]
133s intake/source/tests/test_json.py::test_jsonfile_discover[None] PASSED [ 76%]
133s intake/source/tests/test_json.py::test_jsonfile_discover[gzip] PASSED [ 76%]
133s intake/source/tests/test_json.py::test_jsonfile_discover[bz2] PASSED [ 76%]
133s intake/source/tests/test_json.py::test_jsonlfile[None] PASSED [ 77%]
133s intake/source/tests/test_json.py::test_jsonlfile[gzip] PASSED [ 77%]
133s intake/source/tests/test_json.py::test_jsonlfile[bz2] PASSED [ 77%]
133s intake/source/tests/test_json.py::test_jsonfilel_none[None] PASSED [ 77%]
133s intake/source/tests/test_json.py::test_jsonfilel_none[gzip] PASSED [ 78%]
133s intake/source/tests/test_json.py::test_jsonfilel_none[bz2] PASSED [ 78%]
133s intake/source/tests/test_json.py::test_jsonfilel_discover[None] PASSED [ 78%]
133s intake/source/tests/test_json.py::test_jsonfilel_discover[gzip] PASSED [ 78%]
133s intake/source/tests/test_json.py::test_jsonfilel_discover[bz2] PASSED [ 79%]
133s intake/source/tests/test_json.py::test_jsonl_head[None] PASSED [ 79%]
133s intake/source/tests/test_json.py::test_jsonl_head[gzip] PASSED [ 79%]
133s intake/source/tests/test_json.py::test_jsonl_head[bz2] PASSED [ 79%]
133s intake/source/tests/test_npy.py::test_one_file[shape0] PASSED [ 79%]
133s intake/source/tests/test_npy.py::test_one_file[shape1] PASSED [ 80%]
133s intake/source/tests/test_npy.py::test_one_file[shape2] PASSED [ 80%]
133s intake/source/tests/test_npy.py::test_one_file[shape3] PASSED [ 80%]
133s intake/source/tests/test_npy.py::test_one_file[shape4] PASSED [ 80%]
133s intake/source/tests/test_npy.py::test_multi_file[shape0] PASSED [ 81%]
133s intake/source/tests/test_npy.py::test_multi_file[shape1] PASSED [ 81%]
133s intake/source/tests/test_npy.py::test_multi_file[shape2] PASSED [ 81%]
133s intake/source/tests/test_npy.py::test_multi_file[shape3] PASSED [ 81%]
133s intake/source/tests/test_npy.py::test_multi_file[shape4] PASSED [ 82%]
133s intake/source/tests/test_npy.py::test_zarr_minimal SKIPPED (could no...) [ 82%]
133s intake/source/tests/test_text.py::test_textfiles PASSED [ 82%]
134s intake/source/tests/test_text.py::test_complex_text[None] PASSED [ 82%]
134s intake/source/tests/test_text.py::test_complex_text[gzip] PASSED [ 83%]
134s intake/source/tests/test_text.py::test_complex_text[bz2] PASSED [ 83%]
134s intake/source/tests/test_text.py::test_complex_bytes[pars0-None] PASSED [ 83%]
134s intake/source/tests/test_text.py::test_complex_bytes[pars0-gzip] PASSED [ 83%]
135s intake/source/tests/test_text.py::test_complex_bytes[pars0-bz2] PASSED [ 83%]
135s intake/source/tests/test_text.py::test_complex_bytes[pars1-None] PASSED [ 84%]
135s intake/source/tests/test_text.py::test_complex_bytes[pars1-gzip] PASSED [ 84%]
135s intake/source/tests/test_text.py::test_complex_bytes[pars1-bz2] PASSED [ 84%]
136s intake/source/tests/test_text.py::test_complex_bytes[pars2-None] PASSED [ 84%]
136s intake/source/tests/test_text.py::test_complex_bytes[pars2-gzip] PASSED [ 85%]
136s intake/source/tests/test_text.py::test_complex_bytes[pars2-bz2] PASSED [ 85%]
137s intake/source/tests/test_text.py::test_complex_bytes[pars3-None] PASSED [ 85%]
137s intake/source/tests/test_text.py::test_complex_bytes[pars3-gzip] PASSED [ 85%]
137s intake/source/tests/test_text.py::test_complex_bytes[pars3-bz2] PASSED [ 86%]
137s intake/source/tests/test_text.py::test_text_persist FAILED [ 86%]
137s intake/source/tests/test_text.py::test_text_export FAILED [ 86%]
137s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_{start_date:%Y%m%d}_{end_date:%Y%m%d}_01_T1_sr_band{band:1d}.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 86%]
137s
intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 87%]
137s intake/source/tests/test_utils.py::test_path_to_glob[{year}/{month}/{day}.csv-*/*/*.csv] PASSED [ 87%]
137s intake/source/tests/test_utils.py::test_path_to_glob[data/**/*.csv-data/**/*.csv] PASSED [ 87%]
137s intake/source/tests/test_utils.py::test_path_to_glob[data/{year:4}{month:02}{day:02}.csv-data/*.csv] PASSED [ 87%]
137s intake/source/tests/test_utils.py::test_path_to_glob[{lone_param}-*] PASSED [ 87%]
137s intake/source/tests/test_utils.py::test_reverse_format[*.csv-apple.csv-expected0] PASSED [ 88%]
137s intake/source/tests/test_utils.py::test_reverse_format[{}.csv-apple.csv-expected1] PASSED [ 88%]
137s intake/source/tests/test_utils.py::test_reverse_format[{fruit}.{}-apple.csv-expected2] PASSED [ 88%]
137s intake/source/tests/test_utils.py::test_reverse_format[data//{fruit}.csv-data/apple.csv-expected3] PASSED [ 88%]
137s intake/source/tests/test_utils.py::test_reverse_format[data\\{fruit}.csv-C:\\data\\apple.csv-expected4] PASSED [ 89%]
137s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-C:\\data\\apple.csv-expected5] PASSED [ 89%]
137s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-data//apple.csv-expected6] PASSED [ 89%]
137s intake/source/tests/test_utils.py::test_reverse_format[{num:d}.csv-k.csv-expected7] PASSED [ 89%]
137s intake/source/tests/test_utils.py::test_reverse_format[{year:d}/{month:d}/{day:d}.csv-2016/2/01.csv-expected8] PASSED [ 90%]
137s intake/source/tests/test_utils.py::test_reverse_format[{year:.4}/{month:.2}/{day:.2}.csv-2016/2/01.csv-expected9] PASSED [ 90%]
137s intake/source/tests/test_utils.py::test_reverse_format[SRLCCTabularDat/Ecoregions_{emissions}_Precip_{model}.csv-/user/examples/SRLCCTabularDat/Ecoregions_a1b_Precip_ECHAM5-MPI.csv-expected10] PASSED [ 90%]
137s
intake/source/tests/test_utils.py::test_reverse_format[data_{date:%Y_%m_%d}.csv-data_2016_10_01.csv-expected11] PASSED [ 90%]
137s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5}-PA19104-expected12] PASSED [ 91%]
137s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5d}.csv-PA19104.csv-expected13] PASSED [ 91%]
137s intake/source/tests/test_utils.py::test_reverse_format[{state:2}{zip:d}.csv-PA19104.csv-expected14] PASSED [ 91%]
137s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{date:%Y%m%d}-expected0] PASSED [ 91%]
137s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{num: .2f}-expected1] PASSED [ 91%]
137s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{percentage:.2%}-expected2] PASSED [ 92%]
137s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[data/{year:4d}{month:02d}{day:02d}.csv-expected3] PASSED [ 92%]
137s intake/source/tests/test_utils.py::test_reverse_format_errors PASSED [ 92%]
137s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year}_{month}_{day}.csv] PASSED [ 92%]
137s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year:d}_{month:02d}_{day:02d}.csv] PASSED [ 93%]
137s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{date:%Y_%m_%d}.csv] PASSED [ 93%]
137s intake/source/tests/test_utils.py::test_path_to_pattern[http://data/band{band:1d}.tif-metadata0-/band{band:1d}.tif] PASSED [ 93%]
137s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-metadata1-/data/band{band:1d}.tif] PASSED [ 93%]
137s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-None-/data/band{band:1d}.tif] PASSED [ 94%]
137s intake/tests/test_config.py::test_load_conf[conf0] PASSED [ 94%]
137s intake/tests/test_config.py::test_load_conf[conf1] PASSED [ 94%]
137s intake/tests/test_config.py::test_load_conf[conf2] PASSED [ 94%]
138s
intake/tests/test_config.py::test_basic PASSED [ 95%] 139s intake/tests/test_config.py::test_cli PASSED [ 95%] 139s intake/tests/test_config.py::test_persist_modes PASSED [ 95%] 139s intake/tests/test_config.py::test_conf PASSED [ 95%] 140s intake/tests/test_config.py::test_conf_auth PASSED [ 95%] 140s intake/tests/test_config.py::test_pathdirs PASSED [ 96%] 140s intake/tests/test_top_level.py::test_autoregister_open PASSED [ 96%] 140s intake/tests/test_top_level.py::test_default_catalogs PASSED [ 96%] 140s intake/tests/test_top_level.py::test_user_catalog PASSED [ 96%] 140s intake/tests/test_top_level.py::test_open_styles PASSED [ 97%] 142s intake/tests/test_top_level.py::test_path_catalog PASSED [ 97%] 142s intake/tests/test_top_level.py::test_bad_open PASSED [ 97%] 142s intake/tests/test_top_level.py::test_output_notebook SKIPPED (could ...) [ 97%] 142s intake/tests/test_top_level.py::test_old_usage PASSED [ 98%] 142s intake/tests/test_top_level.py::test_no_imports PASSED [ 98%] 142s intake/tests/test_top_level.py::test_nested_catalog_access PASSED [ 98%] 142s intake/tests/test_utils.py::test_windows_file_path PASSED [ 98%] 142s intake/tests/test_utils.py::test_make_path_posix_removes_double_sep PASSED [ 99%] 142s intake/tests/test_utils.py::test_noops[~/fake.file] PASSED [ 99%] 142s intake/tests/test_utils.py::test_noops[https://example.com] PASSED [ 99%] 142s intake/tests/test_utils.py::test_roundtrip_file_path PASSED [ 99%] 142s intake/tests/test_utils.py::test_yaml_tuples PASSED [100%] 142s 142s ==================================== ERRORS ==================================== 142s ____________ ERROR at setup of test_no_panel_does_not_raise_errors _____________ 142s 142s attr = 'pytest_plugins' 142s 142s def __getattr__(attr): 142s if attr == 'instance': 142s do_import() 142s > return gl['instance'] 142s E KeyError: 'instance' 142s 142s intake/interface/__init__.py:39: KeyError 142s _______________ ERROR at setup of test_no_panel_display_init_gui 
_______________ 142s 142s attr = 'pytest_plugins' 142s 142s def __getattr__(attr): 142s if attr == 'instance': 142s do_import() 142s > return gl['instance'] 142s E KeyError: 'instance' 142s 142s intake/interface/__init__.py:39: KeyError 142s ___________________ ERROR at setup of test_display_init_gui ____________________ 142s 142s attr = 'pytest_plugins' 142s 142s def __getattr__(attr): 142s if attr == 'instance': 142s do_import() 142s > return gl['instance'] 142s E KeyError: 'instance' 142s 142s intake/interface/__init__.py:39: KeyError 142s =================================== FAILURES =================================== 142s ______________________________ test_load_textfile ______________________________ 142s 142s catalog_cache = 142s 142s def test_load_textfile(catalog_cache): 142s cat = catalog_cache['text_cache'] 142s cache = cat.cache[0] 142s 142s cache_paths = cache.load(cat._urlpath, output=False) 142s > cache_path = cache_paths[-1] 142s E TypeError: 'NoneType' object is not subscriptable 142s 142s intake/catalog/tests/test_caching_integration.py:53: TypeError 142s _________________________________ test_nested __________________________________ 142s 142s args = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv',) 142s kwargs = {'storage_options': None} 142s func = .read at 0x3ff80209080> 142s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files') 142s 142s @wraps(fn) 142s def wrapper(*args, **kwargs): 142s func = getattr(self, dispatch_name) 142s try: 142s > return func(*args, **kwargs) 142s 142s /usr/lib/python3/dist-packages/dask/backends.py:140: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 142s return read_pandas( 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
142s 
142s reader = 
142s urlpath = '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv'
142s blocksize = 'default', lineterminator = '\n', compression = 'infer'
142s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
142s storage_options = None, include_path_column = False, kwargs = {}
142s reader_name = 'read_csv', b_lineterminator = b'\n', kw = 'chunksize'
142s lastskiprow = 0, firstrow = 0
142s 
142s     def read_pandas(
142s         reader,
142s         urlpath,
142s         blocksize="default",
142s         lineterminator=None,
142s         compression="infer",
142s         sample=256000,
142s         sample_rows=10,
142s         enforce=False,
142s         assume_missing=False,
142s         storage_options=None,
142s         include_path_column=False,
142s         **kwargs,
142s     ):
142s         reader_name = reader.__name__
142s         if lineterminator is not None and len(lineterminator) == 1:
142s             kwargs["lineterminator"] = lineterminator
142s         else:
142s             lineterminator = "\n"
142s         if "encoding" in kwargs:
142s             b_lineterminator = lineterminator.encode(kwargs["encoding"])
142s             empty_blob = "".encode(kwargs["encoding"])
142s             if empty_blob:
142s                 # This encoding starts with a Byte Order Mark (BOM), so strip that from the
142s                 # start of the line terminator, since this value is not a full file.
142s                 b_lineterminator = b_lineterminator[len(empty_blob) :]
142s         else:
142s             b_lineterminator = lineterminator.encode()
142s         if include_path_column and isinstance(include_path_column, bool):
142s             include_path_column = "path"
142s         if "index" in kwargs or (
142s             "index_col" in kwargs and kwargs.get("index_col") is not False
142s         ):
142s             raise ValueError(
142s                 "Keywords 'index' and 'index_col' not supported, except for "
142s                 "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead"
142s             )
142s         for kw in ["iterator", "chunksize"]:
142s             if kw in kwargs:
142s                 raise ValueError(f"{kw} not supported for dd.{reader_name}")
142s         if kwargs.get("nrows", None):
142s             raise ValueError(
142s                 "The 'nrows' keyword is not supported by "
142s                 "`dd.{0}`. To achieve the same behavior, it's "
142s                 "recommended to use `dd.{0}(...)."
142s                 "head(n=nrows)`".format(reader_name)
142s             )
142s         if isinstance(kwargs.get("skiprows"), int):
142s             lastskiprow = firstrow = kwargs.get("skiprows")
142s         elif kwargs.get("skiprows") is None:
142s             lastskiprow = firstrow = 0
142s         else:
142s             # When skiprows is a list, we expect more than max(skiprows) to
142s             # be included in the sample. This means that [0,2] will work well,
142s             # but [0, 440] might not work.
142s             skiprows = set(kwargs.get("skiprows"))
142s             lastskiprow = max(skiprows)
142s             # find the firstrow that is not skipped, for use as header
142s             firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows))
142s         if isinstance(kwargs.get("header"), list):
142s             raise TypeError(f"List of header rows not supported for dd.{reader_name}")
142s         if isinstance(kwargs.get("converters"), dict) and include_path_column:
142s             path_converter = kwargs.get("converters").get(include_path_column, None)
142s         else:
142s             path_converter = None
142s 
142s         # If compression is "infer", inspect the (first) path suffix and
142s         # set the proper compression option if the suffix is recognized.
142s         if compression == "infer":
142s             # Translate the input urlpath to a simple path list
142s             paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[
142s                 2
142s             ]
142s 
142s             # Check for at least one valid path
142s             if len(paths) == 0:
142s >               raise OSError(f"{urlpath} resolved to no files")
142s E               OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError
142s 
142s The above exception was the direct cause of the following exception:
142s 
142s catalog1 = 
142s 
142s     def test_nested(catalog1):
142s         assert 'nested' in catalog1
142s         assert 'entry1' in catalog1.nested.nested()
142s >       assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read())
142s 
142s intake/catalog/tests/test_local.py:86: 
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
142s intake/source/csv.py:129: in read
142s     self._get_schema()
142s intake/source/csv.py:115: in _get_schema
142s     self._open_dataset(urlpath)
142s intake/source/csv.py:94: in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
142s 
142s args = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv',)
142s kwargs = {'storage_options': None}
142s func = .read at 0x3ff80209080>
142s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files')
142s 
142s     @wraps(fn)
142s     def wrapper(*args, **kwargs):
142s         func = getattr(self, dispatch_name)
142s         try:
142s             return func(*args, **kwargs)
142s         except Exception as e:
142s             try:
142s                 exc = type(e)(
142s                     f"An error occurred while calling the {funcname(func)} "
142s                     f"method registered to the {self.backend} backend.\n"
142s                     f"Original Message: {e}"
142s                 )
142s             except TypeError:
142s                 raise e
142s             else:
142s >               raise exc from e
142s E               OSError: An error occurred while calling the read_csv method registered to the pandas backend.
142s E               Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError
142s ______________________________ test_info_describe ______________________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_info_describe(intake_server):
142s         catalog = open_catalog(intake_server)
142s 
142s         assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1',
142s                                            'entry1_part', 'remote_env',
142s                                            'local_env', 'text', 'arr', 'datetime'])
142s 
142s >       info = catalog['entry1'].describe()
142s 
142s intake/catalog/tests/test_remote_integration.py:29: 
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
142s 
142s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe'
142s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}}
142s page_size = None, persist_mode = 'default'
142s auth = , getenv = True
142s getshell = True
142s 
142s     def open_remote(url, entry, container, user_parameters, description, http_args,
142s                     page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None):
142s         """Create either local direct data source or remote streamed source"""
142s         from intake.container import container_map
142s         import msgpack
142s         import requests
142s         from requests.compat import urljoin
142s 
142s         if url.startswith('intake://'):
142s             url = url[len('intake://'):]
142s         payload = dict(action='open',
142s                        name=entry,
142s                        parameters=user_parameters,
142s                        available_plugins=list(plugin_registry))
142s         req = requests.post(urljoin(url, 'v1/source'),
142s                             data=msgpack.packb(payload, **pack_kwargs),
142s                             **http_args)
142s         if req.ok:
142s             response = msgpack.unpackb(req.content, **unpack_kwargs)
142s 
142s             if 'plugin' in response:
142s                 pl = response['plugin']
142s                 pl = [pl] if isinstance(pl, str) else pl
142s                 # Direct access
142s                 for p in pl:
142s                     if p in plugin_registry:
142s                         source = plugin_registry[p](**response['args'])
142s                         proxy = False
142s                         break
142s                 else:
142s                     proxy = True
142s             else:
142s                 proxy = True
142s             if proxy:
142s                 response.pop('container')
142s                 response.update({'name': entry, 'parameters': user_parameters})
142s                 if container == 'catalog':
142s                     response.update({'auth': auth,
142s                                      'getenv': getenv,
142s                                      'getshell': getshell,
142s                                      'page_size': page_size,
142s                                      'persist_mode': persist_mode
142s                                      # TODO ttl?
142s                                      # TODO storage_options?
142s                                      })
142s                 source = container_map[container](url, http_args, **response)
142s             source.description = description
142s             return source
142s         else:
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend.
142s E           Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
142s ---------------------------- Captured stderr setup -----------------------------
142s 2025-11-17 16:43:40,592 - intake - INFO - __main__.py:main:L53 - Creating catalog from:
142s 2025-11-17 16:43:40,592 - intake - INFO - __main__.py:main:L55 -  - /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog1.yml
142s 2025-11-17 16:43:40,798 - intake - INFO - __main__.py:main:L62 - catalog_args: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog1.yml
142s 2025-11-17 16:43:40,798 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483
142s ----------------------------- Captured stderr call -----------------------------
142s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 
142s Dask dataframe query planning is disabled because dask-expr is not installed.
142s 
142s You can install it with `pip install dask[dataframe]` or `conda install dask`.
142s This will raise in a future version.
142s 
142s   warnings.warn(msg, FutureWarning)
142s Traceback (most recent call last):
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper
142s     return func(*args, **kwargs)
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read
142s     return read_pandas(
142s         reader,
142s         ...<10 lines>...
142s         **kwargs,
142s     )
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas
142s     raise OSError(f"{urlpath} resolved to no files")
142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s The above exception was the direct cause of the following exception:
142s 
142s Traceback (most recent call last):
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post
142s     source.discover()
142s     ~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover
142s     self._load_metadata()
142s     ~~~~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata
142s     self._schema = self._get_schema()
142s                    ~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema
142s     self._open_dataset(urlpath)
142s     ~~~~~~~~~~~~~~~~~~^^^^^^^^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s                       ~~~~~~~~~~~~~~~~~~~~~~~^
142s         urlpath, storage_options=self._storage_options,
142s         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
142s         **self._csv_kwargs)
142s         ^^^^^^^^^^^^^^^^^^^
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper
142s     raise exc from e
142s OSError: An error occurred while calling the read_csv method registered to the pandas backend.
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 400 POST /v1/source (::1): Discover failed
142s 400 POST /v1/source (::1) 124.11ms
142s ______________________________ test_remote_direct ______________________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_remote_direct(intake_server):
142s         from intake.container.dataframe import RemoteDataFrame
142s         catalog = open_catalog(intake_server)
142s >       s0 = catalog.entry1()
142s 
142s intake/catalog/tests/test_remote_integration.py:74: 
142s intake/catalog/base.py:391: in __getattr__
142s     return self[item]  # triggers reload_on_change
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E       Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
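Every failure above carries the same rewrapped exception, produced by dask's backend-dispatch wrapper (`dask/backends.py`): re-raise an exception of the same type with a backend-prefixed message, chained to the original via `raise ... from e`. A minimal sketch of that pattern; `wrap_backend_errors` is a hypothetical helper, not dask's actual API, and the stub `read_csv` only mimics the `csv.py:644` failure:

```python
from functools import wraps


def wrap_backend_errors(backend, fn):
    """Sketch of the rewrapping seen in the tracebacks: rebuild the
    exception as the same type with a prefixed message, keeping the
    original reachable as __cause__ via `raise ... from e`."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as e:
            try:
                exc = type(e)(
                    f"An error occurred while calling the {fn.__name__} "
                    f"method registered to the {backend} backend.\n"
                    f"Original Message: {e}"
                )
            except TypeError:
                # Exception type has a non-string constructor;
                # give up on rewrapping and re-raise as-is.
                raise e
            raise exc from e
    return wrapper


def read_csv(path):
    # Stub mirroring the root-cause OSError from dask's read_pandas.
    raise OSError(f"{path} resolved to no files")


wrapped = wrap_backend_errors("pandas", read_csv)
```

Because the new exception has the same type, callers catching `OSError` still work, while the message and `__cause__` preserve the original diagnosis.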
142s 400 POST /v1/source (::1): Discover failed
142s 400 POST /v1/source (::1) 2.45ms
142s _______________________ test_remote_datasource_interface _______________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_remote_datasource_interface(intake_server):
142s         catalog = open_catalog(intake_server)
142s 
142s >       d = catalog['entry1']
142s 
142s intake/catalog/tests/test_remote_integration.py:101: 
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E       Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
142s 400 POST /v1/source (::1): Discover failed
142s 400 POST /v1/source (::1) 2.41ms
142s __________________________________ test_read ___________________________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_read(intake_server):
142s         catalog = open_catalog(intake_server)
142s 
142s >       d = catalog['entry1']
142s 
142s intake/catalog/tests/test_remote_integration.py:116: 
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E       Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
142s 400 POST /v1/source (::1): Discover failed
142s 400 POST /v1/source (::1) 2.28ms
142s _______________________________ test_read_chunks _______________________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_read_chunks(intake_server):
142s         catalog = open_catalog(intake_server)
142s 
142s >       d = catalog.entry1
142s 
142s intake/catalog/tests/test_remote_integration.py:170: 
142s intake/catalog/base.py:391: in __getattr__
142s     return self[item]  # triggers reload_on_change
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E       Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
142s 400 POST /v1/source (::1): Discover failed
142s 400 POST /v1/source (::1) 2.33ms
142s _____________________________ test_read_partition ______________________________
142s 
142s intake_server = 'intake://localhost:7483'
142s 
142s     def test_read_partition(intake_server):
142s         catalog = open_catalog(intake_server)
142s 
142s >       d = catalog.entry1
142s 
142s intake/catalog/tests/test_remote_integration.py:186: 
142s intake/catalog/base.py:391: in __getattr__
142s     return self[item]  # triggers reload_on_change
142s intake/catalog/base.py:436: in __getitem__
142s     s = self._get_entry(key)
142s intake/catalog/utils.py:45: in wrapper
142s     return f(self, *args, **kwargs)
142s intake/catalog/base.py:323: in _get_entry
142s     return entry()
142s intake/catalog/entry.py:77: in __call__
142s     s = self.get(**kwargs)
142s intake/catalog/remote.py:459: in get
142s     return open_remote(
142s >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
142s E       Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files
142s 
142s intake/catalog/remote.py:519: Exception
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 400 POST /v1/source (::1): Discover failed 142s 400 POST /v1/source (::1) 2.20ms 142s __________________________________ test_close __________________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_close(intake_server): 142s catalog = open_catalog(intake_server) 142s 142s > d = catalog.entry1 142s 142s intake/catalog/tests/test_remote_integration.py:201: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s intake/catalog/base.py:391: in __getattr__ 142s return self[item] # triggers reload_on_change 142s intake/catalog/base.py:436: in __getitem__ 142s s = self._get_entry(key) 142s intake/catalog/utils.py:45: in wrapper 142s return f(self, *args, **kwargs) 142s intake/catalog/base.py:323: in _get_entry 142s return entry() 142s intake/catalog/entry.py:77: in __call__ 142s s = self.get(**kwargs) 142s intake/catalog/remote.py:459: in get 142s return open_remote( 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s 142s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 142s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 142s page_size = None, persist_mode = 'default' 142s auth = , getenv = True 142s getshell = True 142s 142s def open_remote(url, entry, container, user_parameters, description, http_args, 142s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 142s """Create either local direct data source or remote streamed source""" 142s from intake.container import container_map 142s import msgpack 142s import requests 142s from requests.compat import urljoin 142s 142s if url.startswith('intake://'): 142s url = url[len('intake://'):] 142s payload = dict(action='open', 142s name=entry, 142s parameters=user_parameters, 142s 
available_plugins=list(plugin_registry)) 142s req = requests.post(urljoin(url, 'v1/source'), 142s data=msgpack.packb(payload, **pack_kwargs), 142s **http_args) 142s if req.ok: 142s response = msgpack.unpackb(req.content, **unpack_kwargs) 142s 142s if 'plugin' in response: 142s pl = response['plugin'] 142s pl = [pl] if isinstance(pl, str) else pl 142s # Direct access 142s for p in pl: 142s if p in plugin_registry: 142s source = plugin_registry[p](**response['args']) 142s proxy = False 142s break 142s else: 142s proxy = True 142s else: 142s proxy = True 142s if proxy: 142s response.pop('container') 142s response.update({'name': entry, 'parameters': user_parameters}) 142s if container == 'catalog': 142s response.update({'auth': auth, 142s 'getenv': getenv, 142s 'getshell': getshell, 142s 'page_size': page_size, 142s 'persist_mode': persist_mode 142s # TODO ttl? 142s # TODO storage_options? 142s }) 142s source = container_map[container](url, http_args, **response) 142s source.description = description 142s return source 142s else: 142s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 142s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s intake/catalog/remote.py:519: Exception 142s ----------------------------- Captured stderr call ----------------------------- 142s Traceback (most recent call last): 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 142s return func(*args, **kwargs) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 142s return read_pandas( 142s reader, 142s ...<10 lines>... 
142s **kwargs, 142s ) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 142s raise OSError(f"{urlpath} resolved to no files") 142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s The above exception was the direct cause of the following exception: 142s 142s Traceback (most recent call last): 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post 142s source.discover() 142s ~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover 142s self._load_metadata() 142s ~~~~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata 142s self._schema = self._get_schema() 142s ~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema 142s self._open_dataset(urlpath) 142s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset 142s self._dataframe = dask.dataframe.read_csv( 142s ~~~~~~~~~~~~~~~~~~~~~~~^ 142s urlpath, storage_options=self._storage_options, 142s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 142s **self._csv_kwargs) 142s ^^^^^^^^^^^^^^^^^^^ 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 142s raise exc from e 142s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 400 POST /v1/source (::1): Discover failed 142s 400 POST /v1/source (::1) 2.11ms 142s __________________________________ test_with ___________________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_with(intake_server): 142s catalog = open_catalog(intake_server) 142s 142s > with catalog.entry1 as f: 142s 142s intake/catalog/tests/test_remote_integration.py:208: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s intake/catalog/base.py:391: in __getattr__ 142s return self[item] # triggers reload_on_change 142s intake/catalog/base.py:436: in __getitem__ 142s s = self._get_entry(key) 142s intake/catalog/utils.py:45: in wrapper 142s return f(self, *args, **kwargs) 142s intake/catalog/base.py:323: in _get_entry 142s return entry() 142s intake/catalog/entry.py:77: in __call__ 142s s = self.get(**kwargs) 142s intake/catalog/remote.py:459: in get 142s return open_remote( 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s 142s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 142s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 142s page_size = None, persist_mode = 'default' 142s auth = , getenv = True 142s getshell = True 142s 142s def open_remote(url, entry, container, user_parameters, description, http_args, 142s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 142s """Create either local direct data source or remote streamed source""" 142s from intake.container import container_map 142s import msgpack 142s import requests 142s from requests.compat import urljoin 142s 142s if url.startswith('intake://'): 142s url = url[len('intake://'):] 142s payload = dict(action='open', 142s name=entry, 142s parameters=user_parameters, 142s 
available_plugins=list(plugin_registry)) 142s req = requests.post(urljoin(url, 'v1/source'), 142s data=msgpack.packb(payload, **pack_kwargs), 142s **http_args) 142s if req.ok: 142s response = msgpack.unpackb(req.content, **unpack_kwargs) 142s 142s if 'plugin' in response: 142s pl = response['plugin'] 142s pl = [pl] if isinstance(pl, str) else pl 142s # Direct access 142s for p in pl: 142s if p in plugin_registry: 142s source = plugin_registry[p](**response['args']) 142s proxy = False 142s break 142s else: 142s proxy = True 142s else: 142s proxy = True 142s if proxy: 142s response.pop('container') 142s response.update({'name': entry, 'parameters': user_parameters}) 142s if container == 'catalog': 142s response.update({'auth': auth, 142s 'getenv': getenv, 142s 'getshell': getshell, 142s 'page_size': page_size, 142s 'persist_mode': persist_mode 142s # TODO ttl? 142s # TODO storage_options? 142s }) 142s source = container_map[container](url, http_args, **response) 142s source.description = description 142s return source 142s else: 142s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 142s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s intake/catalog/remote.py:519: Exception 142s ----------------------------- Captured stderr call ----------------------------- 142s Traceback (most recent call last): 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 142s return func(*args, **kwargs) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 142s return read_pandas( 142s reader, 142s ...<10 lines>... 
142s **kwargs, 142s ) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 142s raise OSError(f"{urlpath} resolved to no files") 142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s The above exception was the direct cause of the following exception: 142s 142s Traceback (most recent call last): 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post 142s source.discover() 142s ~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover 142s self._load_metadata() 142s ~~~~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata 142s self._schema = self._get_schema() 142s ~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema 142s self._open_dataset(urlpath) 142s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset 142s self._dataframe = dask.dataframe.read_csv( 142s ~~~~~~~~~~~~~~~~~~~~~~~^ 142s urlpath, storage_options=self._storage_options, 142s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 142s **self._csv_kwargs) 142s ^^^^^^^^^^^^^^^^^^^ 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 142s raise exc from e 142s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 400 POST /v1/source (::1): Discover failed 142s 400 POST /v1/source (::1) 2.07ms 142s _________________________________ test_pickle __________________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_pickle(intake_server): 142s catalog = open_catalog(intake_server) 142s 142s > d = catalog.entry1 142s 142s intake/catalog/tests/test_remote_integration.py:215: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s intake/catalog/base.py:391: in __getattr__ 142s return self[item] # triggers reload_on_change 142s intake/catalog/base.py:436: in __getitem__ 142s s = self._get_entry(key) 142s intake/catalog/utils.py:45: in wrapper 142s return f(self, *args, **kwargs) 142s intake/catalog/base.py:323: in _get_entry 142s return entry() 142s intake/catalog/entry.py:77: in __call__ 142s s = self.get(**kwargs) 142s intake/catalog/remote.py:459: in get 142s return open_remote( 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s 142s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 142s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 142s page_size = None, persist_mode = 'default' 142s auth = , getenv = True 142s getshell = True 142s 142s def open_remote(url, entry, container, user_parameters, description, http_args, 142s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 142s """Create either local direct data source or remote streamed source""" 142s from intake.container import container_map 142s import msgpack 142s import requests 142s from requests.compat import urljoin 142s 142s if url.startswith('intake://'): 142s url = url[len('intake://'):] 142s payload = dict(action='open', 142s name=entry, 142s parameters=user_parameters, 142s 
available_plugins=list(plugin_registry)) 142s req = requests.post(urljoin(url, 'v1/source'), 142s data=msgpack.packb(payload, **pack_kwargs), 142s **http_args) 142s if req.ok: 142s response = msgpack.unpackb(req.content, **unpack_kwargs) 142s 142s if 'plugin' in response: 142s pl = response['plugin'] 142s pl = [pl] if isinstance(pl, str) else pl 142s # Direct access 142s for p in pl: 142s if p in plugin_registry: 142s source = plugin_registry[p](**response['args']) 142s proxy = False 142s break 142s else: 142s proxy = True 142s else: 142s proxy = True 142s if proxy: 142s response.pop('container') 142s response.update({'name': entry, 'parameters': user_parameters}) 142s if container == 'catalog': 142s response.update({'auth': auth, 142s 'getenv': getenv, 142s 'getshell': getshell, 142s 'page_size': page_size, 142s 'persist_mode': persist_mode 142s # TODO ttl? 142s # TODO storage_options? 142s }) 142s source = container_map[container](url, http_args, **response) 142s source.description = description 142s return source 142s else: 142s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 142s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s intake/catalog/remote.py:519: Exception 142s ----------------------------- Captured stderr call ----------------------------- 142s Traceback (most recent call last): 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 142s return func(*args, **kwargs) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 142s return read_pandas( 142s reader, 142s ...<10 lines>... 
142s **kwargs, 142s ) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 142s raise OSError(f"{urlpath} resolved to no files") 142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s The above exception was the direct cause of the following exception: 142s 142s Traceback (most recent call last): 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post 142s source.discover() 142s ~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover 142s self._load_metadata() 142s ~~~~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata 142s self._schema = self._get_schema() 142s ~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema 142s self._open_dataset(urlpath) 142s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset 142s self._dataframe = dask.dataframe.read_csv( 142s ~~~~~~~~~~~~~~~~~~~~~~~^ 142s urlpath, storage_options=self._storage_options, 142s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 142s **self._csv_kwargs) 142s ^^^^^^^^^^^^^^^^^^^ 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 142s raise exc from e 142s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 400 POST /v1/source (::1): Discover failed 142s 400 POST /v1/source (::1) 2.43ms 142s _________________________________ test_to_dask _________________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_to_dask(intake_server): 142s catalog = open_catalog(intake_server) 142s > d = catalog.entry1 142s 142s intake/catalog/tests/test_remote_integration.py:231: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s intake/catalog/base.py:391: in __getattr__ 142s return self[item] # triggers reload_on_change 142s intake/catalog/base.py:436: in __getitem__ 142s s = self._get_entry(key) 142s intake/catalog/utils.py:45: in wrapper 142s return f(self, *args, **kwargs) 142s intake/catalog/base.py:323: in _get_entry 142s return entry() 142s intake/catalog/entry.py:77: in __call__ 142s s = self.get(**kwargs) 142s intake/catalog/remote.py:459: in get 142s return open_remote( 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s 142s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 142s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 142s page_size = None, persist_mode = 'default' 142s auth = , getenv = True 142s getshell = True 142s 142s def open_remote(url, entry, container, user_parameters, description, http_args, 142s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 142s """Create either local direct data source or remote streamed source""" 142s from intake.container import container_map 142s import msgpack 142s import requests 142s from requests.compat import urljoin 142s 142s if url.startswith('intake://'): 142s url = url[len('intake://'):] 142s payload = dict(action='open', 142s name=entry, 142s parameters=user_parameters, 142s available_plugins=list(plugin_registry)) 
142s req = requests.post(urljoin(url, 'v1/source'), 142s data=msgpack.packb(payload, **pack_kwargs), 142s **http_args) 142s if req.ok: 142s response = msgpack.unpackb(req.content, **unpack_kwargs) 142s 142s if 'plugin' in response: 142s pl = response['plugin'] 142s pl = [pl] if isinstance(pl, str) else pl 142s # Direct access 142s for p in pl: 142s if p in plugin_registry: 142s source = plugin_registry[p](**response['args']) 142s proxy = False 142s break 142s else: 142s proxy = True 142s else: 142s proxy = True 142s if proxy: 142s response.pop('container') 142s response.update({'name': entry, 'parameters': user_parameters}) 142s if container == 'catalog': 142s response.update({'auth': auth, 142s 'getenv': getenv, 142s 'getshell': getshell, 142s 'page_size': page_size, 142s 'persist_mode': persist_mode 142s # TODO ttl? 142s # TODO storage_options? 142s }) 142s source = container_map[container](url, http_args, **response) 142s source.description = description 142s return source 142s else: 142s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 142s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s intake/catalog/remote.py:519: Exception 142s ----------------------------- Captured stderr call ----------------------------- 142s Traceback (most recent call last): 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 142s return func(*args, **kwargs) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 142s return read_pandas( 142s reader, 142s ...<10 lines>... 
142s **kwargs, 142s ) 142s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 142s raise OSError(f"{urlpath} resolved to no files") 142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 142s The above exception was the direct cause of the following exception: 142s 142s Traceback (most recent call last): 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post 142s source.discover() 142s ~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover 142s self._load_metadata() 142s ~~~~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata 142s self._schema = self._get_schema() 142s ~~~~~~~~~~~~~~~~^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema 142s self._open_dataset(urlpath) 142s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 142s File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset 142s self._dataframe = dask.dataframe.read_csv( 142s ~~~~~~~~~~~~~~~~~~~~~~~^ 142s urlpath, storage_options=self._storage_options, 142s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 142s **self._csv_kwargs) 142s ^^^^^^^^^^^^^^^^^^^ 142s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 142s raise exc from e 142s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests//entry1_*.csv resolved to no files 142s 400 POST /v1/source (::1): Discover failed 142s 400 POST /v1/source (::1) 2.37ms 142s _____________________________ test_remote_sequence _____________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_remote_sequence(intake_server): 142s import glob 142s d = os.path.dirname(TEST_CATALOG_PATH) 142s catalog = open_catalog(intake_server) 142s assert 'text' in catalog 142s s = catalog.text() 142s s.discover() 142s > assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml'))) 142s E AssertionError: assert 0 == 29 142s E + where 0 = sources:\n text:\n args:\n dtype: null\n extra_metadata:\n catalog_dir: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/\n headers:\n headers: {}\n name: text\n npartitions: 0\n parameters: {}\n shape:\n - null\n source_id: b5b0348e-cc31-46fb-8411-a2085b8b7b55\n url: http://localhost:7483/\n description: textfiles in this dir\n driver: intake.container.semistructured.RemoteSequenceSource\n metadata:\n catalog_dir: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/\n.npartitions 142s E + and 29 = len(['/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/plugins_source_non_string.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/obsolete_data_source_list.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/plugins_source_non_dict.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/params_value_non_dict.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog_union_2.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog_dup_sources.yml', ...]) 142s E + where ['/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/plugins_source_non_string.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/obsolete_data_source_list.yml', 
'/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/plugins_source_non_dict.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/params_value_non_dict.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog_union_2.yml', '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/catalog_dup_sources.yml', ...] = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/*.yml') 142s E + where = .glob 142s E + and '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests/*.yml' = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/catalog/tests', '*.yml') 142s E + where = .join 142s E + where = os.path 142s 142s intake/catalog/tests/test_remote_integration.py:263: AssertionError 142s ___________________________________ test_dir ___________________________________ 142s 142s intake_server = 'intake://localhost:7483' 142s 142s def test_dir(intake_server): 142s PAGE_SIZE = 2 142s catalog = open_catalog(intake_server, page_size=PAGE_SIZE) 142s assert len(catalog._entries._page_cache) == 0 142s assert len(catalog._entries._direct_lookup_cache) == 0 142s assert not catalog._entries.complete 142s 142s with pytest.warns(UserWarning, match="Tab-complete"): 142s key_completions = catalog._ipython_key_completions_() 142s with pytest.warns(UserWarning, match="Tab-complete"): 142s dir_ = dir(catalog) 142s # __dir__ triggers loading the first page. 142s assert len(catalog._entries._page_cache) == 2 142s assert len(catalog._entries._direct_lookup_cache) == 0 142s assert not catalog._entries.complete 142s assert set(key_completions) == set(['use_example1', 'nested']) 142s assert 'metadata' in dir_ # a normal attribute 142s assert 'use_example1' in dir_ # an entry from the first page 142s assert 'arr' not in dir_ # an entry we haven't cached yet 142s 142s # Trigger fetching one specific name. 
142s catalog['arr'] 142s with pytest.warns(UserWarning, match="Tab-complete"): 142s dir_ = dir(catalog) 142s with pytest.warns(UserWarning, match="Tab-complete"): 142s key_completions = catalog._ipython_key_completions_() 142s assert 'metadata' in dir_ 142s assert 'arr' in dir_ # an entry cached via direct access 142s assert 'arr' in key_completions 142s 142s # Load everything. 142s list(catalog) 142s assert catalog._entries.complete 142s > with pytest.warns(None) as record: 142s 142s intake/catalog/tests/test_remote_integration.py:338: 142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 142s 142s self = WarningsChecker(record=True), expected_warning = None, match_expr = None 142s 142s def __init__( 142s self, 142s expected_warning: type[Warning] | tuple[type[Warning], ...] = Warning, 142s match_expr: str | Pattern[str] | None = None, 142s *, 142s _ispytest: bool = False, 142s ) -> None: 142s check_ispytest(_ispytest) 142s super().__init__(_ispytest=True) 142s 142s msg = "exceptions must be derived from Warning, not %s" 142s if isinstance(expected_warning, tuple): 142s for exc in expected_warning: 142s if not issubclass(exc, Warning): 142s raise TypeError(msg % type(exc)) 142s expected_warning_tup = expected_warning 142s elif isinstance(expected_warning, type) and issubclass( 142s expected_warning, Warning 142s ): 142s expected_warning_tup = (expected_warning,) 142s else: 142s > raise TypeError(msg % type(expected_warning)) 142s E TypeError: exceptions must be derived from Warning, not 142s 142s /usr/lib/python3/dist-packages/_pytest/recwarn.py:279: TypeError 142s ________________________________ test_discover _________________________________ 142s 142s def test_discover(): 142s cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML, 142s 'entry1'] 142s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 142s universal_newlines=True) 142s out, _ = process.communicate() 142s 142s > assert "'dtype':" in out 142s E 
assert "'dtype':" in '' 142s 142s intake/cli/client/tests/test_local_integration.py:89: AssertionError 142s ----------------------------- Captured stderr call ----------------------------- 142s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 142s Dask dataframe query planning is disabled because dask-expr is not installed. 142s 142s You can install it with `pip install dask[dataframe]` or `conda install dask`. 142s This will raise in a future version. 142s 142s warnings.warn(msg, FutureWarning) 142s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 142s ________________________________ test_get_pass _________________________________ 142s 142s def test_get_pass(): 142s cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1'] 142s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 142s universal_newlines=True) 142s out, _ = process.communicate() 142s 142s > assert 'Charlie1 25.0 3' in out 142s E AssertionError: assert 'Charlie1 25.0 3' in '' 142s 142s intake/cli/client/tests/test_local_integration.py:101: AssertionError 142s ----------------------------- Captured stderr call ----------------------------- 142s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 142s Dask dataframe query planning is disabled because dask-expr is not installed. 142s 142s You can install it with `pip install dask[dataframe]` or `conda install dask`. 142s This will raise in a future version. 
142s
142s warnings.warn(msg, FutureWarning)
142s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/client/tests//entry1_*.csv resolved to no files')
142s ______________________ TestServerV1Source.test_idle_timer ______________________
142s
142s self =
142s
142s     def test_idle_timer(self):
142s         self.server.start_periodic_functions(close_idle_after=0.1,
142s                                              remove_idle_after=0.2)
142s
142s         msg = dict(action='open', name='entry1', parameters={})
142s >       resp_msg, = self.make_post_request(msg)
142s
142s intake/cli/server/tests/test_server.py:208:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/cli/server/tests/test_server.py:96: in make_post_request
142s     self.assertEqual(response.code, expected_status)
142s E   AssertionError: 400 != 200
142s ----------------------------- Captured stderr call -----------------------------
142s Traceback (most recent call last):
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper
142s     return func(*args, **kwargs)
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read
142s     return read_pandas(
142s         reader,
142s     ...<10 lines>...
142s         **kwargs,
142s     )
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas
142s     raise OSError(f"{urlpath} resolved to no files")
142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s
142s The above exception was the direct cause of the following exception:
142s
142s Traceback (most recent call last):
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post
142s     source.discover()
142s     ~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover
142s     self._load_metadata()
142s     ~~~~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata
142s     self._schema = self._get_schema()
142s                    ~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema
142s     self._open_dataset(urlpath)
142s     ~~~~~~~~~~~~~~~~~~^^^^^^^^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s                       ~~~~~~~~~~~~~~~~~~~~~~~^
142s         urlpath, storage_options=self._storage_options,
142s         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
142s         **self._csv_kwargs)
142s         ^^^^^^^^^^^^^^^^^^^
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper
142s     raise exc from e
142s OSError: An error occurred while calling the read_csv method registered to the pandas backend.
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s ------------------------------ Captured log call -------------------------------
142s WARNING  tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed
142s WARNING  tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 3.92ms
142s ______________________ TestServerV1Source.test_no_format _______________________
142s
142s self =
142s
142s     def test_no_format(self):
142s         msg = dict(action='open', name='entry1', parameters={})
142s >       resp_msg, = self.make_post_request(msg)
142s
142s intake/cli/server/tests/test_server.py:195:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/cli/server/tests/test_server.py:96: in make_post_request
142s     self.assertEqual(response.code, expected_status)
142s E   AssertionError: 400 != 200
142s ----------------------------- Captured stderr call -----------------------------
142s Traceback (most recent call last):
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper
142s     return func(*args, **kwargs)
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read
142s     return read_pandas(
142s         reader,
142s     ...<10 lines>...
142s         **kwargs,
142s     )
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas
142s     raise OSError(f"{urlpath} resolved to no files")
142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s
142s The above exception was the direct cause of the following exception:
142s
142s Traceback (most recent call last):
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post
142s     source.discover()
142s     ~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover
142s     self._load_metadata()
142s     ~~~~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata
142s     self._schema = self._get_schema()
142s                    ~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema
142s     self._open_dataset(urlpath)
142s     ~~~~~~~~~~~~~~~~~~^^^^^^^^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s                       ~~~~~~~~~~~~~~~~~~~~~~~^
142s         urlpath, storage_options=self._storage_options,
142s         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
142s         **self._csv_kwargs)
142s         ^^^^^^^^^^^^^^^^^^^
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper
142s     raise exc from e
142s OSError: An error occurred while calling the read_csv method registered to the pandas backend.
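The repeated 400 responses from the server tests all share one root cause: dask's `read_csv` expands the urlpath glob before reading anything, and an empty expansion is an error rather than an empty dataframe. A stdlib-only sketch of that guard (`glob` here stands in for fsspec's path resolution, and `resolve_csv_glob` is our name, not dask's):

```python
from glob import glob

def resolve_csv_glob(urlpath):
    # Mirror of the read_pandas check in the tracebacks above:
    # an empty glob expansion raises OSError instead of yielding
    # a dataframe with zero partitions.
    paths = sorted(glob(urlpath))
    if len(paths) == 0:
        raise OSError(f"{urlpath} resolved to no files")
    return paths

try:
    resolve_csv_glob("/no/such/dir/entry1_*.csv")
except OSError as exc:
    print(exc)  # same "resolved to no files" message as in the log
```

Here the fix belongs in the test fixtures, not dask: the `entry1_*.csv` files were evidently never copied next to the installed test modules, so every source that touches them fails at discover time.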
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s ------------------------------ Captured log call -------------------------------
142s WARNING  tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed
142s WARNING  tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 3.16ms
142s _________________________ TestServerV1Source.test_open _________________________
142s
142s self =
142s
142s     def test_open(self):
142s         msg = dict(action='open', name='entry1', parameters={})
142s >       resp_msg, = self.make_post_request(msg)
142s
142s intake/cli/server/tests/test_server.py:112:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/cli/server/tests/test_server.py:96: in make_post_request
142s     self.assertEqual(response.code, expected_status)
142s E   AssertionError: 400 != 200
142s ----------------------------- Captured stderr call -----------------------------
142s Traceback (most recent call last):
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper
142s     return func(*args, **kwargs)
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read
142s     return read_pandas(
142s         reader,
142s     ...<10 lines>...
142s         **kwargs,
142s     )
142s   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas
142s     raise OSError(f"{urlpath} resolved to no files")
142s OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s
142s The above exception was the direct cause of the following exception:
142s
142s Traceback (most recent call last):
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/server.py", line 306, in post
142s     source.discover()
142s     ~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 347, in discover
142s     self._load_metadata()
142s     ~~~~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/base.py", line 285, in _load_metadata
142s     self._schema = self._get_schema()
142s                    ~~~~~~~~~~~~~~~~^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 115, in _get_schema
142s     self._open_dataset(urlpath)
142s     ~~~~~~~~~~~~~~~~~~^^^^^^^^^
142s   File "/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/csv.py", line 94, in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s                       ~~~~~~~~~~~~~~~~~~~~~~~^
142s         urlpath, storage_options=self._storage_options,
142s         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
142s         **self._csv_kwargs)
142s         ^^^^^^^^^^^^^^^^^^^
142s   File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper
142s     raise exc from e
142s OSError: An error occurred while calling the read_csv method registered to the pandas backend.
142s Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/cli/server/tests//entry1_*.csv resolved to no files
142s ------------------------------ Captured log call -------------------------------
142s WARNING  tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed
142s WARNING  tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 3.21ms
142s ________________________________ test_other_cat ________________________________
142s
142s args = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv',)
142s kwargs = {'storage_options': None}
142s func = .read at 0x3ff80209080>
142s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files')
142s
142s     @wraps(fn)
142s     def wrapper(*args, **kwargs):
142s         func = getattr(self, dispatch_name)
142s         try:
142s >           return func(*args, **kwargs)
142s
142s /usr/lib/python3/dist-packages/dask/backends.py:140:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read
142s     return read_pandas(
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s
142s reader =
142s urlpath = '/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv'
142s blocksize = 'default', lineterminator = '\n', compression = 'infer'
142s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
142s storage_options = None, include_path_column = False, kwargs = {}
142s reader_name = 'read_csv', b_lineterminator = b'\n', kw = 'chunksize'
142s lastskiprow = 0, firstrow = 0
142s
142s     def read_pandas(
142s         reader,
142s         urlpath,
142s         blocksize="default",
142s         lineterminator=None,
142s         compression="infer",
142s         sample=256000,
142s         sample_rows=10,
142s         enforce=False,
142s         assume_missing=False,
142s         storage_options=None,
142s         include_path_column=False,
142s         **kwargs,
142s     ):
142s         reader_name = reader.__name__
142s         if lineterminator is not None and len(lineterminator) == 1:
142s             kwargs["lineterminator"] = lineterminator
142s         else:
142s             lineterminator = "\n"
142s         if "encoding" in kwargs:
142s             b_lineterminator = lineterminator.encode(kwargs["encoding"])
142s             empty_blob = "".encode(kwargs["encoding"])
142s             if empty_blob:
142s                 # This encoding starts with a Byte Order Mark (BOM), so strip that from the
142s                 # start of the line terminator, since this value is not a full file.
142s                 b_lineterminator = b_lineterminator[len(empty_blob) :]
142s         else:
142s             b_lineterminator = lineterminator.encode()
142s         if include_path_column and isinstance(include_path_column, bool):
142s             include_path_column = "path"
142s         if "index" in kwargs or (
142s             "index_col" in kwargs and kwargs.get("index_col") is not False
142s         ):
142s             raise ValueError(
142s                 "Keywords 'index' and 'index_col' not supported, except for "
142s                 "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead"
142s             )
142s         for kw in ["iterator", "chunksize"]:
142s             if kw in kwargs:
142s                 raise ValueError(f"{kw} not supported for dd.{reader_name}")
142s         if kwargs.get("nrows", None):
142s             raise ValueError(
142s                 "The 'nrows' keyword is not supported by "
142s                 "`dd.{0}`. To achieve the same behavior, it's "
142s                 "recommended to use `dd.{0}(...)."
142s                 "head(n=nrows)`".format(reader_name)
142s             )
142s         if isinstance(kwargs.get("skiprows"), int):
142s             lastskiprow = firstrow = kwargs.get("skiprows")
142s         elif kwargs.get("skiprows") is None:
142s             lastskiprow = firstrow = 0
142s         else:
142s             # When skiprows is a list, we expect more than max(skiprows) to
142s             # be included in the sample. This means that [0,2] will work well,
142s             # but [0, 440] might not work.
142s             skiprows = set(kwargs.get("skiprows"))
142s             lastskiprow = max(skiprows)
142s             # find the firstrow that is not skipped, for use as header
142s             firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows))
142s         if isinstance(kwargs.get("header"), list):
142s             raise TypeError(f"List of header rows not supported for dd.{reader_name}")
142s         if isinstance(kwargs.get("converters"), dict) and include_path_column:
142s             path_converter = kwargs.get("converters").get(include_path_column, None)
142s         else:
142s             path_converter = None
142s
142s         # If compression is "infer", inspect the (first) path suffix and
142s         # set the proper compression option if the suffix is recognized.
142s         if compression == "infer":
142s             # Translate the input urlpath to a simple path list
142s             paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[
142s                 2
142s             ]
142s
142s             # Check for at least one valid path
142s             if len(paths) == 0:
142s >               raise OSError(f"{urlpath} resolved to no files")
142s E               OSError: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
142s
142s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError
142s
142s The above exception was the direct cause of the following exception:
142s
142s     def test_other_cat():
142s         cat = intake.open_catalog(catfile)
142s >       df1 = cat.other_cat.read()
142s
142s intake/source/tests/test_derived.py:35:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/source/derived.py:252: in read
142s     return self.to_dask().compute()
142s intake/source/derived.py:239: in to_dask
142s     self._df = self._transform(self._source.to_dask(),
142s intake/source/csv.py:133: in to_dask
142s     self._get_schema()
142s intake/source/csv.py:115: in _get_schema
142s     self._open_dataset(urlpath)
142s intake/source/csv.py:94: in _open_dataset
142s     self._dataframe = dask.dataframe.read_csv(
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s
142s args = ('/tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv',)
142s kwargs = {'storage_options': None}
142s func = .read at 0x3ff80209080>
142s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files')
142s
142s     @wraps(fn)
142s     def wrapper(*args, **kwargs):
142s         func = getattr(self, dispatch_name)
142s         try:
142s             return func(*args, **kwargs)
142s         except Exception as e:
142s             try:
142s                 exc = type(e)(
142s                     f"An error occurred while calling the {funcname(func)} "
142s                     f"method registered to the {self.backend} backend.\n"
142s                     f"Original Message: {e}"
142s                 )
142s             except TypeError:
142s                 raise e
142s             else:
142s >               raise exc from e
142s E               OSError: An error occurred while calling the read_csv method registered to the pandas backend.
142s E               Original Message: /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
142s
142s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError
142s ______________________________ test_text_persist _______________________________
142s
142s temp_cache = None
142s
142s     def test_text_persist(temp_cache):
142s         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
142s         s = cat.sometext()
142s >       s2 = s.persist()
142s
142s intake/source/tests/test_text.py:88:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/source/base.py:226: in persist
142s     out = self._export(store.getdir(self), **kwargs)
142s intake/source/base.py:460: in _export
142s     out = method(self, path=path, **kwargs)
142s intake/container/semistructured.py:70: in _persist
142s     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
142s intake/container/semistructured.py:90: in _data_to_source
142s     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
142s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files
142s     fs, fs_token, paths = get_fs_token_paths(
142s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths
142s     paths = _expand_paths(paths, name_function, num)
142s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths
142s     name_function = build_name_function(num - 1)
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s
142s max_int = -0.99999999
142s
142s     def build_name_function(max_int: float) -> Callable[[int], str]:
142s         """Returns a function that receives a single integer
142s         and returns it as a string padded by enough zero characters
142s         to align with maximum possible integer
142s
142s         >>> name_f = build_name_function(57)
142s
142s         >>> name_f(7)
142s         '07'
142s         >>> name_f(31)
142s         '31'
142s         >>> build_name_function(1000)(42)
142s         '0042'
142s         >>> build_name_function(999)(42)
142s         '042'
142s         >>> build_name_function(0)(0)
142s         '0'
142s         """
142s         # handle corner cases max_int is 0 or exact power of 10
142s         max_int += 1e-8
142s
142s >       pad_length = int(math.ceil(math.log10(max_int)))
142s E       ValueError: math domain error
142s
142s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError
142s _______________________________ test_text_export _______________________________
142s
142s temp_cache = None
142s
142s     def test_text_export(temp_cache):
142s         import tempfile
142s         outdir = tempfile.mkdtemp()
142s         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
142s         s = cat.sometext()
142s >       out = s.export(outdir)
142s
142s intake/source/tests/test_text.py:97:
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s intake/source/base.py:452: in export
142s     return self._export(path, **kwargs)
142s intake/source/base.py:460: in _export
142s     out = method(self, path=path, **kwargs)
142s intake/container/semistructured.py:70: in _persist
142s     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
142s intake/container/semistructured.py:90: in _data_to_source
142s     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
142s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files
142s     fs, fs_token, paths = get_fs_token_paths(
142s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths
142s     paths = _expand_paths(paths, name_function, num)
142s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths
142s     name_function = build_name_function(num - 1)
142s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
142s
142s max_int = -0.99999999
142s
142s     def build_name_function(max_int: float) -> Callable[[int], str]:
142s         """Returns a function that receives a single integer
142s         and returns it as a string padded by enough zero characters
142s         to align with maximum possible integer
142s
142s         >>> name_f = build_name_function(57)
142s
142s         >>> name_f(7)
142s         '07'
142s         >>> name_f(31)
142s         '31'
142s         >>> build_name_function(1000)(42)
142s         '0042'
142s         >>> build_name_function(999)(42)
142s         '042'
142s         >>> build_name_function(0)(0)
142s         '0'
142s         """
142s         # handle corner cases max_int is 0 or exact power of 10
142s         max_int += 1e-8
142s
142s >       pad_length = int(math.ceil(math.log10(max_int)))
142s E       ValueError: math domain error
142s
142s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError
142s =============================== warnings summary ===============================
142s intake/catalog/tests/test_alias.py::test_simple
142s   /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning:
142s   Dask dataframe query planning is disabled because dask-expr is not installed.
142s
142s   You can install it with `pip install dask[dataframe]` or `conda install dask`.
142s   This will raise in a future version.
142s
142s   warnings.warn(msg, FutureWarning)
142s
142s intake/source/tests/test_cache.py::test_filtered_compressed_cache
142s intake/source/tests/test_cache.py::test_compressions[tgz]
142s intake/source/tests/test_cache.py::test_compressions[tgz]
142s   /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/decompress.py:27: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
142s     tar.extractall(outpath)
142s
142s intake/source/tests/test_cache.py::test_compressions[tbz]
142s intake/source/tests/test_cache.py::test_compressions[tbz]
142s   /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/decompress.py:37: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
142s     tar.extractall(outpath)
142s
142s intake/source/tests/test_cache.py::test_compressions[tar]
142s intake/source/tests/test_cache.py::test_compressions[tar]
142s   /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/decompress.py:47: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
142s     tar.extractall(outpath)
142s
142s intake/source/tests/test_discovery.py::test_package_scan
142s intake/source/tests/test_discovery.py::test_package_scan
142s intake/source/tests/test_discovery.py::test_enable_and_disable
142s intake/source/tests/test_discovery.py::test_discover_collision
142s   /tmp/autopkgtest.SS0vJB/build.nia/src/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed
142s     warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning)
142s
142s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
142s =========================== short test summary info ============================
142s FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile
142s FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface
142s FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition
142s FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except...
142s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence
142s FAILED intake/catalog/tests/test_remote_integration.py::test_dir - TypeError:...
142s FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass...
142s FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass...
142s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer
142s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format
142s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open
142s FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro...
142s FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math...
142s FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ...
142s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors
142s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui
142s ERROR intake/interface/tests/test_init_gui.py::test_display_init_gui - KeyErr...
142s ====== 22 failed, 379 passed, 31 skipped, 12 warnings, 3 errors in 40.11s ======
142s autopkgtest [16:44:02]: test run-unit-test: -----------------------]
143s autopkgtest [16:44:03]: test run-unit-test: - - - - - - - - - - results - - - - - - - - - -
143s run-unit-test FAIL non-zero exit status 1
143s autopkgtest [16:44:03]: @@@@@@@@@@@@@@@@@@@@ summary
143s run-unit-test FAIL non-zero exit status 1