0s autopkgtest [16:48:31]: starting date and time: 2025-11-17 16:48:31+0000
0s autopkgtest [16:48:31]: git checkout: 4b346b80 nova: make wait_reboot return success even when a no-op
0s autopkgtest [16:48:31]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.b1ifcxg6/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:intake --apt-upgrade intake --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=intake/0.6.6-4 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-cpu2-ram4-disk20-amd64 --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@sto01-3.secgroup --name adt-resolute-amd64-intake-20251117-164831-juju-7f2275-prod-proposed-migration-environment-20-c6147c00-4d1d-47e1-96e2-db9fd5657f92 --image adt/ubuntu-resolute-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-autopkgtest-workers-amd64 -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
4s Creating nova instance adt-resolute-amd64-intake-20251117-164831-juju-7f2275-prod-proposed-migration-environment-20-c6147c00-4d1d-47e1-96e2-db9fd5657f92 from image adt/ubuntu-resolute-amd64-server-20251117.img (UUID 9762b0cc-7c5b-4854-acd5-cc74ad0de8c6)...
43s autopkgtest [16:49:14]: testbed dpkg architecture: amd64
43s autopkgtest [16:49:14]: testbed apt version: 3.1.11
43s autopkgtest [16:49:14]: @@@@@@@@@@@@@@@@@@@@ test bed setup
44s autopkgtest [16:49:15]: testbed release detected to be: None
44s autopkgtest [16:49:15]: updating testbed package index (apt update)
44s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease [87.8 kB]
44s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
44s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
45s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
45s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse Sources [22.9 kB]
45s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/restricted Sources [9852 B]
45s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/main Sources [73.2 kB]
45s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/universe Sources [779 kB]
45s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main i386 Packages [113 kB]
45s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/main amd64 Packages [153 kB]
45s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/main amd64 c-n-f Metadata [3236 B]
45s Get:12 http://ftpmaster.internal/ubuntu resolute-proposed/restricted amd64 Packages [64.6 kB]
45s Get:13 http://ftpmaster.internal/ubuntu resolute-proposed/restricted i386 Packages [3744 B]
45s Get:14 http://ftpmaster.internal/ubuntu resolute-proposed/restricted amd64 c-n-f Metadata [336 B]
45s Get:15 http://ftpmaster.internal/ubuntu resolute-proposed/universe amd64 Packages [543 kB]
45s Get:16 http://ftpmaster.internal/ubuntu resolute-proposed/universe i386 Packages [254 kB]
45s Get:17 http://ftpmaster.internal/ubuntu resolute-proposed/universe amd64 c-n-f Metadata [20.1 kB]
45s Get:18 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse i386 Packages [6516 B]
45s Get:19 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse amd64 Packages [13.4 kB]
45s Get:20 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse amd64 c-n-f Metadata [680 B]
47s Fetched 2149 kB in 1s (2293 kB/s)
47s Reading package lists...
48s Hit:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease
48s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease
48s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease
48s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease
48s Reading package lists...
48s Reading package lists...
48s Building dependency tree...
48s Reading state information...
48s Calculating upgrade...
49s The following packages will be upgraded:
49s apt libapt-pkg7.0 libcrypt-dev libcrypt1 libunwind8 usbutils
49s 6 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
49s Need to get 2978 kB of archives.
49s After this operation, 46.1 kB of additional disk space will be used.
49s Get:1 http://ftpmaster.internal/ubuntu resolute/main amd64 libcrypt-dev amd64 1:4.5.1-1 [122 kB]
49s Get:2 http://ftpmaster.internal/ubuntu resolute/main amd64 libcrypt1 amd64 1:4.5.1-1 [90.7 kB]
49s Get:3 http://ftpmaster.internal/ubuntu resolute/main amd64 libapt-pkg7.0 amd64 3.1.12 [1148 kB]
49s Get:4 http://ftpmaster.internal/ubuntu resolute/main amd64 apt amd64 3.1.12 [1474 kB]
49s Get:5 http://ftpmaster.internal/ubuntu resolute/main amd64 usbutils amd64 1:019-1 [83.9 kB]
49s Get:6 http://ftpmaster.internal/ubuntu resolute/main amd64 libunwind8 amd64 1.8.3-0ubuntu1 [59.6 kB]
49s dpkg-preconfigure: unable to re-open stdin: No such file or directory
49s Fetched 2978 kB in 0s (7279 kB/s)
49s (Reading database ... 83372 files and directories currently installed.)
49s Preparing to unpack .../libcrypt-dev_1%3a4.5.1-1_amd64.deb ...
49s Unpacking libcrypt-dev:amd64 (1:4.5.1-1) over (1:4.4.38-1build1) ...
49s Preparing to unpack .../libcrypt1_1%3a4.5.1-1_amd64.deb ...
49s Unpacking libcrypt1:amd64 (1:4.5.1-1) over (1:4.4.38-1build1) ...
49s Setting up libcrypt1:amd64 (1:4.5.1-1) ...
49s (Reading database ... 83372 files and directories currently installed.)
49s Preparing to unpack .../libapt-pkg7.0_3.1.12_amd64.deb ...
49s Unpacking libapt-pkg7.0:amd64 (3.1.12) over (3.1.11) ...
49s Preparing to unpack .../archives/apt_3.1.12_amd64.deb ...
50s Unpacking apt (3.1.12) over (3.1.11) ...
50s Preparing to unpack .../usbutils_1%3a019-1_amd64.deb ...
50s Unpacking usbutils (1:019-1) over (1:018-2) ...
50s Preparing to unpack .../libunwind8_1.8.3-0ubuntu1_amd64.deb ...
50s Unpacking libunwind8:amd64 (1.8.3-0ubuntu1) over (1.8.1-0.1ubuntu1) ...
50s Setting up libunwind8:amd64 (1.8.3-0ubuntu1) ...
50s Setting up usbutils (1:019-1) ...
50s Setting up libcrypt-dev:amd64 (1:4.5.1-1) ...
50s Setting up libapt-pkg7.0:amd64 (3.1.12) ...
50s Setting up apt (3.1.12) ...
50s Processing triggers for man-db (2.13.1-1) ...
51s Processing triggers for libc-bin (2.42-2ubuntu2) ...
51s autopkgtest [16:49:22]: upgrading testbed (apt dist-upgrade and autopurge)
52s Reading package lists...
52s Building dependency tree...
52s Reading state information...
52s Calculating upgrade...
52s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
52s Reading package lists...
52s Building dependency tree...
52s Reading state information...
53s Solving dependencies...
53s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
53s autopkgtest [16:49:24]: rebooting testbed after setup commands that affected boot
81s autopkgtest [16:49:52]: testbed running kernel: Linux 6.17.0-5-generic #5-Ubuntu SMP PREEMPT_DYNAMIC Mon Sep 22 10:00:33 UTC 2025
83s autopkgtest [16:49:54]: @@@@@@@@@@@@@@@@@@@@ apt-source intake
84s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (dsc) [2693 B]
84s Get:2 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (tar) [4447 kB]
84s Get:3 http://ftpmaster.internal/ubuntu resolute-proposed/universe intake 0.6.6-4 (diff) [15.8 kB]
84s gpgv: Signature made Wed Aug 27 08:46:02 2025 UTC
84s gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A
84s gpgv: issuer "tchet@debian.org"
84s gpgv: Can't check signature: No public key
84s dpkg-source: warning: cannot verify inline signature for ./intake_0.6.6-4.dsc: no acceptable signature found
85s autopkgtest [16:49:56]: testing package intake version 0.6.6-4
85s autopkgtest [16:49:56]: build not needed
85s autopkgtest [16:49:56]: test run-unit-test: preparing testbed
85s Reading package lists...
86s Building dependency tree...
86s Reading state information...
86s Solving dependencies...
86s The following NEW packages will be installed:
86s fonts-font-awesome fonts-glyphicons-halflings fonts-lato libblas3
86s libgfortran5 libjs-bootstrap libjs-jquery libjs-sphinxdoc libjs-underscore
86s liblapack3 node-html5shiv python3-aiohappyeyeballs python3-aiohttp
86s python3-aiosignal python3-all python3-async-timeout python3-click
86s python3-cloudpickle python3-dask python3-entrypoints python3-frozenlist
86s python3-fsspec python3-iniconfig python3-intake python3-intake-doc
86s python3-locket python3-msgpack python3-msgpack-numpy python3-multidict
86s python3-numpy python3-numpy-dev python3-pandas python3-pandas-lib
86s python3-partd python3-platformdirs python3-pluggy python3-propcache
86s python3-pytest python3-pytz python3-toolz python3-tornado python3-yarl
86s sphinx-rtd-theme-common
86s 0 upgraded, 43 newly installed, 0 to remove and 0 not upgraded.
86s Need to get 29.6 MB of archives.
86s After this operation, 155 MB of additional disk space will be used.
86s Get:1 http://ftpmaster.internal/ubuntu resolute/main amd64 fonts-lato all 2.015-1 [2781 kB] 86s Get:2 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-numpy-dev amd64 1:2.2.4+ds-1ubuntu1 [147 kB] 86s Get:3 http://ftpmaster.internal/ubuntu resolute/main amd64 libblas3 amd64 3.12.1-7 [259 kB] 86s Get:4 http://ftpmaster.internal/ubuntu resolute/main amd64 libgfortran5 amd64 15.2.0-7ubuntu1 [939 kB] 86s Get:5 http://ftpmaster.internal/ubuntu resolute/main amd64 liblapack3 amd64 3.12.1-7 [2739 kB] 86s Get:6 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-numpy amd64 1:2.2.4+ds-1ubuntu1 [5377 kB] 86s Get:7 http://ftpmaster.internal/ubuntu resolute/main amd64 fonts-font-awesome all 5.0.10+really4.7.0~dfsg-4.1 [516 kB] 86s Get:8 http://ftpmaster.internal/ubuntu resolute/universe amd64 fonts-glyphicons-halflings all 1.009~3.4.1+dfsg-6 [119 kB] 86s Get:9 http://ftpmaster.internal/ubuntu resolute/universe amd64 libjs-bootstrap all 3.4.1+dfsg-6 [129 kB] 86s Get:10 http://ftpmaster.internal/ubuntu resolute/main amd64 libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [328 kB] 86s Get:11 http://ftpmaster.internal/ubuntu resolute/main amd64 libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [118 kB] 86s Get:12 http://ftpmaster.internal/ubuntu resolute/main amd64 libjs-sphinxdoc all 8.2.3-1ubuntu2 [28.0 kB] 86s Get:13 http://ftpmaster.internal/ubuntu resolute/universe amd64 node-html5shiv all 3.7.3+dfsg-5 [13.5 kB] 86s Get:14 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-aiohappyeyeballs all 2.6.1-2 [11.1 kB] 86s Get:15 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-multidict amd64 6.4.3-1build1 [69.2 kB] 86s Get:16 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-propcache amd64 0.3.1-1build1 [54.5 kB] 86s Get:17 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-yarl amd64 1.22.0-1 [98.2 kB] 86s Get:18 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-async-timeout all 5.0.1-1 [6830 B] 86s Get:19 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-frozenlist amd64 1.8.0-1 [53.5 kB] 86s Get:20 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-aiosignal all 1.4.0-1 [5628 B] 86s Get:21 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-aiohttp amd64 3.11.16-1 [367 kB] 86s Get:22 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-all amd64 3.13.7-1 [884 B] 86s Get:23 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-click all 8.2.0+0.really.8.1.8-1 [80.0 kB] 86s Get:24 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-cloudpickle all 3.1.1-1 [22.4 kB] 86s Get:25 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-fsspec all 2025.3.2-1ubuntu1 [217 kB] 86s Get:26 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-toolz all 1.0.0-2 [45.0 kB] 86s Get:27 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-locket all 1.0.0-2 [5872 B] 86s Get:28 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-partd all 1.4.2-1 [15.7 kB] 86s Get:29 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-dask all 2024.12.1+dfsg-2 [875 kB] 86s Get:30 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-entrypoints all 0.4-3 [7174 B] 86s Get:31 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-iniconfig all 2.1.0-1 [6840 B] 86s Get:32 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-msgpack amd64 1.0.3-3build5 [114 kB] 86s Get:33 http://ftpmaster.internal/ubuntu 
resolute/main amd64 python3-platformdirs all 4.3.7-1 [16.9 kB] 86s Get:34 http://ftpmaster.internal/ubuntu resolute-proposed/universe amd64 python3-intake amd64 0.6.6-4 [197 kB] 86s Get:35 http://ftpmaster.internal/ubuntu resolute/main amd64 sphinx-rtd-theme-common all 3.0.2+dfsg-3 [1013 kB] 87s Get:36 http://ftpmaster.internal/ubuntu resolute-proposed/universe amd64 python3-intake-doc all 0.6.6-4 [1549 kB] 87s Get:37 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-msgpack-numpy all 0.4.8-1 [7388 B] 87s Get:38 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-pytz all 2025.2-4 [32.3 kB] 87s Get:39 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-pandas-lib amd64 2.3.3+dfsg-1ubuntu1 [7668 kB] 87s Get:40 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-pandas all 2.3.3+dfsg-1ubuntu1 [2948 kB] 87s Get:41 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-pluggy all 1.6.0-1 [21.0 kB] 87s Get:42 http://ftpmaster.internal/ubuntu resolute/universe amd64 python3-pytest all 8.3.5-2 [252 kB] 87s Get:43 http://ftpmaster.internal/ubuntu resolute/main amd64 python3-tornado amd64 6.5.2-3 [304 kB] 87s Fetched 29.6 MB in 1s (28.4 MB/s) 87s Selecting previously unselected package fonts-lato. 87s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 83372 files and directories currently installed.) 87s Preparing to unpack .../00-fonts-lato_2.015-1_all.deb ... 87s Unpacking fonts-lato (2.015-1) ... 87s Selecting previously unselected package python3-numpy-dev:amd64. 87s Preparing to unpack .../01-python3-numpy-dev_1%3a2.2.4+ds-1ubuntu1_amd64.deb ... 87s Unpacking python3-numpy-dev:amd64 (1:2.2.4+ds-1ubuntu1) ... 87s Selecting previously unselected package libblas3:amd64. 87s Preparing to unpack .../02-libblas3_3.12.1-7_amd64.deb ... 87s Unpacking libblas3:amd64 (3.12.1-7) ... 87s Selecting previously unselected package libgfortran5:amd64. 87s Preparing to unpack .../03-libgfortran5_15.2.0-7ubuntu1_amd64.deb ... 87s Unpacking libgfortran5:amd64 (15.2.0-7ubuntu1) ... 87s Selecting previously unselected package liblapack3:amd64. 87s Preparing to unpack .../04-liblapack3_3.12.1-7_amd64.deb ... 87s Unpacking liblapack3:amd64 (3.12.1-7) ... 87s Selecting previously unselected package python3-numpy. 87s Preparing to unpack .../05-python3-numpy_1%3a2.2.4+ds-1ubuntu1_amd64.deb ... 87s Unpacking python3-numpy (1:2.2.4+ds-1ubuntu1) ... 87s Selecting previously unselected package fonts-font-awesome. 87s Preparing to unpack .../06-fonts-font-awesome_5.0.10+really4.7.0~dfsg-4.1_all.deb ... 87s Unpacking fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ... 87s Selecting previously unselected package fonts-glyphicons-halflings. 87s Preparing to unpack .../07-fonts-glyphicons-halflings_1.009~3.4.1+dfsg-6_all.deb ... 87s Unpacking fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 87s Selecting previously unselected package libjs-bootstrap. 87s Preparing to unpack .../08-libjs-bootstrap_3.4.1+dfsg-6_all.deb ... 
87s Unpacking libjs-bootstrap (3.4.1+dfsg-6) ... 87s Selecting previously unselected package libjs-jquery. 87s Preparing to unpack .../09-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... 87s Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... 87s Selecting previously unselected package libjs-underscore. 87s Preparing to unpack .../10-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... 87s Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... 88s Selecting previously unselected package libjs-sphinxdoc. 88s Preparing to unpack .../11-libjs-sphinxdoc_8.2.3-1ubuntu2_all.deb ... 88s Unpacking libjs-sphinxdoc (8.2.3-1ubuntu2) ... 88s Selecting previously unselected package node-html5shiv. 88s Preparing to unpack .../12-node-html5shiv_3.7.3+dfsg-5_all.deb ... 88s Unpacking node-html5shiv (3.7.3+dfsg-5) ... 88s Selecting previously unselected package python3-aiohappyeyeballs. 88s Preparing to unpack .../13-python3-aiohappyeyeballs_2.6.1-2_all.deb ... 88s Unpacking python3-aiohappyeyeballs (2.6.1-2) ... 88s Selecting previously unselected package python3-multidict. 88s Preparing to unpack .../14-python3-multidict_6.4.3-1build1_amd64.deb ... 88s Unpacking python3-multidict (6.4.3-1build1) ... 88s Selecting previously unselected package python3-propcache. 88s Preparing to unpack .../15-python3-propcache_0.3.1-1build1_amd64.deb ... 88s Unpacking python3-propcache (0.3.1-1build1) ... 88s Selecting previously unselected package python3-yarl. 88s Preparing to unpack .../16-python3-yarl_1.22.0-1_amd64.deb ... 88s Unpacking python3-yarl (1.22.0-1) ... 88s Selecting previously unselected package python3-async-timeout. 88s Preparing to unpack .../17-python3-async-timeout_5.0.1-1_all.deb ... 88s Unpacking python3-async-timeout (5.0.1-1) ... 88s Selecting previously unselected package python3-frozenlist. 88s Preparing to unpack .../18-python3-frozenlist_1.8.0-1_amd64.deb ... 88s Unpacking python3-frozenlist (1.8.0-1) ... 88s Selecting previously unselected package python3-aiosignal. 88s Preparing to unpack .../19-python3-aiosignal_1.4.0-1_all.deb ... 88s Unpacking python3-aiosignal (1.4.0-1) ... 88s Selecting previously unselected package python3-aiohttp. 88s Preparing to unpack .../20-python3-aiohttp_3.11.16-1_amd64.deb ... 88s Unpacking python3-aiohttp (3.11.16-1) ... 88s Selecting previously unselected package python3-all. 88s Preparing to unpack .../21-python3-all_3.13.7-1_amd64.deb ... 88s Unpacking python3-all (3.13.7-1) ... 88s Selecting previously unselected package python3-click. 88s Preparing to unpack .../22-python3-click_8.2.0+0.really.8.1.8-1_all.deb ... 88s Unpacking python3-click (8.2.0+0.really.8.1.8-1) ... 88s Selecting previously unselected package python3-cloudpickle. 88s Preparing to unpack .../23-python3-cloudpickle_3.1.1-1_all.deb ... 88s Unpacking python3-cloudpickle (3.1.1-1) ... 88s Selecting previously unselected package python3-fsspec. 88s Preparing to unpack .../24-python3-fsspec_2025.3.2-1ubuntu1_all.deb ... 88s Unpacking python3-fsspec (2025.3.2-1ubuntu1) ... 88s Selecting previously unselected package python3-toolz. 88s Preparing to unpack .../25-python3-toolz_1.0.0-2_all.deb ... 88s Unpacking python3-toolz (1.0.0-2) ... 88s Selecting previously unselected package python3-locket. 88s Preparing to unpack .../26-python3-locket_1.0.0-2_all.deb ... 88s Unpacking python3-locket (1.0.0-2) ... 88s Selecting previously unselected package python3-partd. 88s Preparing to unpack .../27-python3-partd_1.4.2-1_all.deb ... 88s Unpacking python3-partd (1.4.2-1) ... 
88s Selecting previously unselected package python3-dask. 88s Preparing to unpack .../28-python3-dask_2024.12.1+dfsg-2_all.deb ... 88s Unpacking python3-dask (2024.12.1+dfsg-2) ... 88s Selecting previously unselected package python3-entrypoints. 88s Preparing to unpack .../29-python3-entrypoints_0.4-3_all.deb ... 88s Unpacking python3-entrypoints (0.4-3) ... 88s Selecting previously unselected package python3-iniconfig. 88s Preparing to unpack .../30-python3-iniconfig_2.1.0-1_all.deb ... 88s Unpacking python3-iniconfig (2.1.0-1) ... 88s Selecting previously unselected package python3-msgpack. 88s Preparing to unpack .../31-python3-msgpack_1.0.3-3build5_amd64.deb ... 88s Unpacking python3-msgpack (1.0.3-3build5) ... 88s Selecting previously unselected package python3-platformdirs. 88s Preparing to unpack .../32-python3-platformdirs_4.3.7-1_all.deb ... 88s Unpacking python3-platformdirs (4.3.7-1) ... 88s Selecting previously unselected package python3-intake. 88s Preparing to unpack .../33-python3-intake_0.6.6-4_amd64.deb ... 88s Unpacking python3-intake (0.6.6-4) ... 88s Selecting previously unselected package sphinx-rtd-theme-common. 88s Preparing to unpack .../34-sphinx-rtd-theme-common_3.0.2+dfsg-3_all.deb ... 88s Unpacking sphinx-rtd-theme-common (3.0.2+dfsg-3) ... 88s Selecting previously unselected package python3-intake-doc. 88s Preparing to unpack .../35-python3-intake-doc_0.6.6-4_all.deb ... 88s Unpacking python3-intake-doc (0.6.6-4) ... 88s Selecting previously unselected package python3-msgpack-numpy. 88s Preparing to unpack .../36-python3-msgpack-numpy_0.4.8-1_all.deb ... 88s Unpacking python3-msgpack-numpy (0.4.8-1) ... 88s Selecting previously unselected package python3-pytz. 88s Preparing to unpack .../37-python3-pytz_2025.2-4_all.deb ... 88s Unpacking python3-pytz (2025.2-4) ... 88s Selecting previously unselected package python3-pandas-lib:amd64. 88s Preparing to unpack .../38-python3-pandas-lib_2.3.3+dfsg-1ubuntu1_amd64.deb ... 88s Unpacking python3-pandas-lib:amd64 (2.3.3+dfsg-1ubuntu1) ... 88s Selecting previously unselected package python3-pandas. 88s Preparing to unpack .../39-python3-pandas_2.3.3+dfsg-1ubuntu1_all.deb ... 88s Unpacking python3-pandas (2.3.3+dfsg-1ubuntu1) ... 88s Selecting previously unselected package python3-pluggy. 88s Preparing to unpack .../40-python3-pluggy_1.6.0-1_all.deb ... 88s Unpacking python3-pluggy (1.6.0-1) ... 88s Selecting previously unselected package python3-pytest. 88s Preparing to unpack .../41-python3-pytest_8.3.5-2_all.deb ... 88s Unpacking python3-pytest (8.3.5-2) ... 88s Selecting previously unselected package python3-tornado. 88s Preparing to unpack .../42-python3-tornado_6.5.2-3_amd64.deb ... 88s Unpacking python3-tornado (6.5.2-3) ... 88s Setting up python3-entrypoints (0.4-3) ... 88s Setting up python3-iniconfig (2.1.0-1) ... 88s Setting up python3-tornado (6.5.2-3) ... 88s Setting up fonts-lato (2.015-1) ... 88s Setting up fonts-glyphicons-halflings (1.009~3.4.1+dfsg-6) ... 88s Setting up python3-fsspec (2025.3.2-1ubuntu1) ... 89s Setting up node-html5shiv (3.7.3+dfsg-5) ... 89s Setting up python3-all (3.13.7-1) ... 89s Setting up python3-pytz (2025.2-4) ... 89s Setting up python3-click (8.2.0+0.really.8.1.8-1) ... 89s Setting up python3-platformdirs (4.3.7-1) ... 89s Setting up python3-multidict (6.4.3-1build1) ... 89s Setting up python3-cloudpickle (3.1.1-1) ... 89s Setting up python3-frozenlist (1.8.0-1) ... 89s Setting up python3-aiosignal (1.4.0-1) ... 89s Setting up python3-async-timeout (5.0.1-1) ... 
89s Setting up libblas3:amd64 (3.12.1-7) ...
89s update-alternatives: using /usr/lib/x86_64-linux-gnu/blas/libblas.so.3 to provide /usr/lib/x86_64-linux-gnu/libblas.so.3 (libblas.so.3-x86_64-linux-gnu) in auto mode
89s Setting up python3-numpy-dev:amd64 (1:2.2.4+ds-1ubuntu1) ...
89s Setting up python3-aiohappyeyeballs (2.6.1-2) ...
89s Setting up libgfortran5:amd64 (15.2.0-7ubuntu1) ...
89s Setting up python3-pluggy (1.6.0-1) ...
89s Setting up python3-propcache (0.3.1-1build1) ...
89s Setting up python3-toolz (1.0.0-2) ...
89s Setting up python3-msgpack (1.0.3-3build5) ...
90s Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ...
90s Setting up python3-locket (1.0.0-2) ...
90s Setting up python3-yarl (1.22.0-1) ...
90s Setting up fonts-font-awesome (5.0.10+really4.7.0~dfsg-4.1) ...
90s Setting up sphinx-rtd-theme-common (3.0.2+dfsg-3) ...
90s Setting up libjs-bootstrap (3.4.1+dfsg-6) ...
90s Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ...
90s Setting up python3-partd (1.4.2-1) ...
90s Setting up liblapack3:amd64 (3.12.1-7) ...
90s update-alternatives: using /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3 to provide /usr/lib/x86_64-linux-gnu/liblapack.so.3 (liblapack.so.3-x86_64-linux-gnu) in auto mode
90s Setting up python3-pytest (8.3.5-2) ...
90s Setting up python3-aiohttp (3.11.16-1) ...
90s Setting up python3-dask (2024.12.1+dfsg-2) ...
91s Setting up python3-numpy (1:2.2.4+ds-1ubuntu1) ...
92s Setting up libjs-sphinxdoc (8.2.3-1ubuntu2) ...
92s Setting up python3-intake (0.6.6-4) ...
92s Setting up python3-msgpack-numpy (0.4.8-1) ...
92s Setting up python3-pandas-lib:amd64 (2.3.3+dfsg-1ubuntu1) ...
92s Setting up python3-intake-doc (0.6.6-4) ...
92s Setting up python3-pandas (2.3.3+dfsg-1ubuntu1) ...
95s Processing triggers for man-db (2.13.1-1) ...
95s Processing triggers for libc-bin (2.42-2ubuntu2) ...
96s autopkgtest [16:50:07]: test run-unit-test: [-----------------------
96s ============================= test session starts ==============================
96s platform linux -- Python 3.13.9, pytest-8.3.5, pluggy-1.6.0 -- /usr/bin/python3.13
96s cachedir: .pytest_cache
96s rootdir: /tmp/autopkgtest.zAAzqt/build.B3R/src
96s plugins: typeguard-4.4.2
97s collecting ... 
collected 424 items / 11 skipped 97s 97s intake/auth/tests/test_auth.py::test_get PASSED [ 0%] 97s intake/auth/tests/test_auth.py::test_base PASSED [ 0%] 97s intake/auth/tests/test_auth.py::test_base_client PASSED [ 0%] 97s intake/auth/tests/test_auth.py::test_base_get_case_insensitive PASSED [ 0%] 97s intake/auth/tests/test_auth.py::test_secret PASSED [ 1%] 97s intake/auth/tests/test_auth.py::test_secret_client PASSED [ 1%] 97s intake/catalog/tests/test_alias.py::test_simple PASSED [ 1%] 97s intake/catalog/tests/test_alias.py::test_mapping PASSED [ 1%] 100s intake/catalog/tests/test_auth_integration.py::test_secret_auth PASSED [ 2%] 103s intake/catalog/tests/test_auth_integration.py::test_secret_auth_fail PASSED [ 2%] 103s intake/catalog/tests/test_caching_integration.py::test_load_csv PASSED [ 2%] 103s intake/catalog/tests/test_caching_integration.py::test_list_of_files PASSED [ 2%] 103s intake/catalog/tests/test_caching_integration.py::test_bad_type_cache PASSED [ 3%] 103s intake/catalog/tests/test_caching_integration.py::test_load_textfile FAILED [ 3%] 103s intake/catalog/tests/test_caching_integration.py::test_load_arr PASSED [ 3%] 103s intake/catalog/tests/test_caching_integration.py::test_regex[test_no_regex] PASSED [ 3%] 103s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_no_match] PASSED [ 4%] 103s intake/catalog/tests/test_caching_integration.py::test_regex[test_regex_partial_match] PASSED [ 4%] 103s intake/catalog/tests/test_caching_integration.py::test_get_metadata PASSED [ 4%] 103s intake/catalog/tests/test_caching_integration.py::test_clear_cache PASSED [ 4%] 103s intake/catalog/tests/test_caching_integration.py::test_clear_cache_bad_metadata PASSED [ 4%] 103s intake/catalog/tests/test_caching_integration.py::test_clear_all PASSED [ 5%] 103s intake/catalog/tests/test_caching_integration.py::test_second_load PASSED [ 5%] 103s intake/catalog/tests/test_caching_integration.py::test_second_load_timestamp PASSED [ 5%] 104s intake/catalog/tests/test_caching_integration.py::test_second_load_refresh PASSED [ 5%] 104s intake/catalog/tests/test_caching_integration.py::test_multiple_cache PASSED [ 6%] 104s intake/catalog/tests/test_caching_integration.py::test_disable_caching PASSED [ 6%] 104s intake/catalog/tests/test_caching_integration.py::test_ds_set_cache_dir PASSED [ 6%] 104s intake/catalog/tests/test_catalog_save.py::test_catalog_description PASSED [ 6%] 104s intake/catalog/tests/test_core.py::test_no_entry PASSED [ 7%] 104s intake/catalog/tests/test_core.py::test_regression PASSED [ 7%] 104s intake/catalog/tests/test_default.py::test_load PASSED [ 7%] 104s intake/catalog/tests/test_discovery.py::test_catalog_discovery PASSED [ 7%] 104s intake/catalog/tests/test_discovery.py::test_deferred_import PASSED [ 8%] 104s intake/catalog/tests/test_gui.py::test_cat_no_panel_does_not_raise_errors PASSED [ 8%] 104s intake/catalog/tests/test_gui.py::test_cat_no_panel_display_gui PASSED [ 8%] 104s intake/catalog/tests/test_gui.py::test_cat_gui SKIPPED (could not im...) [ 8%] 104s intake/catalog/tests/test_gui.py::test_entry_no_panel_does_not_raise_errors PASSED [ 8%] 104s intake/catalog/tests/test_gui.py::test_entry_no_panel_display_gui PASSED [ 9%] 104s intake/catalog/tests/test_gui.py::test_entry_gui SKIPPED (could not ...) 
[ 9%] 104s intake/catalog/tests/test_local.py::test_local_catalog PASSED [ 9%] 104s intake/catalog/tests/test_local.py::test_get_items PASSED [ 9%] 104s intake/catalog/tests/test_local.py::test_nested FAILED [ 10%] 104s intake/catalog/tests/test_local.py::test_nested_gets_name_from_super PASSED [ 10%] 104s intake/catalog/tests/test_local.py::test_hash PASSED [ 10%] 104s intake/catalog/tests/test_local.py::test_getitem PASSED [ 10%] 104s intake/catalog/tests/test_local.py::test_source_plugin_config PASSED [ 11%] 104s intake/catalog/tests/test_local.py::test_metadata PASSED [ 11%] 104s intake/catalog/tests/test_local.py::test_use_source_plugin_from_config PASSED [ 11%] 104s intake/catalog/tests/test_local.py::test_get_dir PASSED [ 11%] 104s intake/catalog/tests/test_local.py::test_entry_dir_function PASSED [ 12%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[bool-False] PASSED [ 12%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[datetime-expected1] PASSED [ 12%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[float-0.0] PASSED [ 12%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[int-0] PASSED [ 12%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[list-expected4] PASSED [ 13%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[str-] PASSED [ 13%] 104s intake/catalog/tests/test_local.py::test_user_parameter_default_value[unicode-] PASSED [ 13%] 104s intake/catalog/tests/test_local.py::test_user_parameter_repr PASSED [ 13%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-true-True] PASSED [ 14%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[bool-0-False] PASSED [ 14%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-given2-expected2] PASSED [ 14%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-2018-01-01 12:34AM-expected3] PASSED [ 14%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[datetime-1234567890000000000-expected4] PASSED [ 15%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[float-3.14-3.14] PASSED [ 15%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[int-1-1] PASSED [ 15%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[list-given7-expected7] PASSED [ 15%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[str-1-1] PASSED [ 16%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_value[unicode-foo-foo] PASSED [ 16%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[now] PASSED [ 16%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_special_datetime[today] PASSED [ 16%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[float-100.0-100.0] PASSED [ 16%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20-20] PASSED [ 17%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_min[int-20.0-20] PASSED [ 17%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[float-100.0-100.0] PASSED [ 17%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20-20] PASSED [ 17%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_max[int-20.0-20] PASSED [ 18%] 104s 
intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[float-given0-expected0] PASSED [ 18%] 104s intake/catalog/tests/test_local.py::test_user_parameter_coerce_allowed[int-given1-expected1] PASSED [ 18%] 104s intake/catalog/tests/test_local.py::test_user_parameter_validation_range PASSED [ 18%] 104s intake/catalog/tests/test_local.py::test_user_parameter_validation_allowed PASSED [ 19%] 104s intake/catalog/tests/test_local.py::test_user_pars_list PASSED [ 19%] 104s intake/catalog/tests/test_local.py::test_user_pars_mlist PASSED [ 19%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[catalog_non_dict] PASSED [ 19%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_missing] PASSED [ 20%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_name_non_string] PASSED [ 20%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_non_dict] PASSED [ 20%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[data_source_value_non_dict] PASSED [ 20%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_missing_required] PASSED [ 20%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_name_non_string] PASSED [ 21%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_non_dict] PASSED [ 21%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_choice] PASSED [ 21%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_bad_type] PASSED [ 21%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[params_value_non_dict] PASSED [ 22%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_non_dict] PASSED [ 22%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing] PASSED [ 22%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_missing_key] PASSED [ 22%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_dict] PASSED [ 23%] 104s intake/catalog/tests/test_local.py::test_parser_validation_error[plugins_source_non_list] PASSED [ 23%] 104s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_data_source_list] PASSED [ 23%] 104s intake/catalog/tests/test_local.py::test_parser_obsolete_error[obsolete_params_list] PASSED [ 23%] 104s intake/catalog/tests/test_local.py::test_union_catalog PASSED [ 24%] 104s intake/catalog/tests/test_local.py::test_persist_local_cat PASSED [ 24%] 104s intake/catalog/tests/test_local.py::test_empty_catalog PASSED [ 24%] 104s intake/catalog/tests/test_local.py::test_nonexistent_error PASSED [ 24%] 104s intake/catalog/tests/test_local.py::test_duplicate_data_sources PASSED [ 25%] 104s intake/catalog/tests/test_local.py::test_duplicate_parameters PASSED [ 25%] 104s intake/catalog/tests/test_local.py::test_catalog_file_removal PASSED [ 25%] 104s intake/catalog/tests/test_local.py::test_flatten_duplicate_error PASSED [ 25%] 104s intake/catalog/tests/test_local.py::test_multi_cat_names PASSED [ 25%] 104s intake/catalog/tests/test_local.py::test_name_of_builtin PASSED [ 26%] 104s intake/catalog/tests/test_local.py::test_cat_with_declared_name PASSED [ 26%] 104s intake/catalog/tests/test_local.py::test_cat_with_no_declared_name_gets_name_from_dir_if_file_named_catalog PASSED [ 26%] 104s intake/catalog/tests/test_local.py::test_default_expansions 
PASSED [ 26%] 104s intake/catalog/tests/test_local.py::test_remote_cat PASSED [ 27%] 104s intake/catalog/tests/test_local.py::test_multi_plugins PASSED [ 27%] 104s intake/catalog/tests/test_local.py::test_no_plugins PASSED [ 27%] 104s intake/catalog/tests/test_local.py::test_explicit_entry_driver PASSED [ 27%] 104s intake/catalog/tests/test_local.py::test_getitem_and_getattr PASSED [ 28%] 104s intake/catalog/tests/test_local.py::test_dot_names PASSED [ 28%] 104s intake/catalog/tests/test_local.py::test_listing PASSED [ 28%] 105s intake/catalog/tests/test_local.py::test_dict_save PASSED [ 28%] 105s intake/catalog/tests/test_local.py::test_dict_save_complex PASSED [ 29%] 105s intake/catalog/tests/test_local.py::test_dict_adddel PASSED [ 29%] 105s intake/catalog/tests/test_local.py::test_filter PASSED [ 29%] 105s intake/catalog/tests/test_local.py::test_from_dict_with_data_source PASSED [ 29%] 105s intake/catalog/tests/test_local.py::test_no_instance PASSED [ 29%] 105s intake/catalog/tests/test_local.py::test_fsspec_integration PASSED [ 30%] 105s intake/catalog/tests/test_local.py::test_cat_add PASSED [ 30%] 105s intake/catalog/tests/test_local.py::test_no_entries_items PASSED [ 30%] 105s intake/catalog/tests/test_local.py::test_cat_dictlike PASSED [ 30%] 105s intake/catalog/tests/test_local.py::test_inherit_params SKIPPED (tes...) [ 31%] 105s intake/catalog/tests/test_local.py::test_runtime_overwrite_params SKIPPED [ 31%] 105s intake/catalog/tests/test_local.py::test_local_param_overwrites SKIPPED [ 31%] 105s intake/catalog/tests/test_local.py::test_local_and_global_params SKIPPED [ 31%] 105s intake/catalog/tests/test_local.py::test_search_inherit_params SKIPPED [ 32%] 105s intake/catalog/tests/test_local.py::test_multiple_cats_params SKIPPED [ 32%] 105s intake/catalog/tests/test_parameters.py::test_simplest PASSED [ 32%] 105s intake/catalog/tests/test_parameters.py::test_cache_default_source PASSED [ 32%] 105s intake/catalog/tests/test_parameters.py::test_parameter_default PASSED [ 33%] 105s intake/catalog/tests/test_parameters.py::test_maybe_default_from_env PASSED [ 33%] 105s intake/catalog/tests/test_parameters.py::test_up_override_and_render PASSED [ 33%] 105s intake/catalog/tests/test_parameters.py::test_user_explicit_override PASSED [ 33%] 105s intake/catalog/tests/test_parameters.py::test_auto_env_expansion PASSED [ 33%] 105s intake/catalog/tests/test_parameters.py::test_validate_up PASSED [ 34%] 105s intake/catalog/tests/test_parameters.py::test_validate_par PASSED [ 34%] 105s intake/catalog/tests/test_parameters.py::test_mlist_parameter PASSED [ 34%] 105s intake/catalog/tests/test_parameters.py::test_explicit_overrides PASSED [ 34%] 105s intake/catalog/tests/test_parameters.py::test_extra_arg PASSED [ 35%] 105s intake/catalog/tests/test_parameters.py::test_unknown PASSED [ 35%] 105s intake/catalog/tests/test_parameters.py::test_catalog_passthrough PASSED [ 35%] 105s intake/catalog/tests/test_persist.py::test_idempotent SKIPPED (could...) [ 35%] 105s intake/catalog/tests/test_persist.py::test_parquet SKIPPED (could no...) 
[ 36%] 107s intake/catalog/tests/test_reload_integration.py::test_reload_updated_config PASSED [ 36%] 109s intake/catalog/tests/test_reload_integration.py::test_reload_updated_directory PASSED [ 36%] 111s intake/catalog/tests/test_reload_integration.py::test_reload_missing_remote_directory PASSED [ 36%] 114s intake/catalog/tests/test_reload_integration.py::test_reload_missing_local_directory PASSED [ 37%] 114s intake/catalog/tests/test_remote_integration.py::test_info_describe FAILED [ 37%] 114s intake/catalog/tests/test_remote_integration.py::test_bad_url PASSED [ 37%] 114s intake/catalog/tests/test_remote_integration.py::test_metadata PASSED [ 37%] 114s intake/catalog/tests/test_remote_integration.py::test_nested_remote PASSED [ 37%] 114s intake/catalog/tests/test_remote_integration.py::test_remote_direct FAILED [ 38%] 114s intake/catalog/tests/test_remote_integration.py::test_entry_metadata PASSED [ 38%] 114s intake/catalog/tests/test_remote_integration.py::test_unknown_source PASSED [ 38%] 114s intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface FAILED [ 38%] 114s intake/catalog/tests/test_remote_integration.py::test_environment_evaluation PASSED [ 39%] 114s intake/catalog/tests/test_remote_integration.py::test_read FAILED [ 39%] 114s intake/catalog/tests/test_remote_integration.py::test_read_direct PASSED [ 39%] 114s intake/catalog/tests/test_remote_integration.py::test_read_chunks FAILED [ 39%] 114s intake/catalog/tests/test_remote_integration.py::test_read_partition FAILED [ 40%] 114s intake/catalog/tests/test_remote_integration.py::test_close FAILED [ 40%] 114s intake/catalog/tests/test_remote_integration.py::test_with FAILED [ 40%] 114s intake/catalog/tests/test_remote_integration.py::test_pickle FAILED [ 40%] 114s intake/catalog/tests/test_remote_integration.py::test_to_dask FAILED [ 41%] 115s intake/catalog/tests/test_remote_integration.py::test_remote_env PASSED [ 41%] 115s intake/catalog/tests/test_remote_integration.py::test_remote_sequence FAILED [ 41%] 115s intake/catalog/tests/test_remote_integration.py::test_remote_arr PASSED [ 41%] 115s intake/catalog/tests/test_remote_integration.py::test_pagination PASSED [ 41%] 115s intake/catalog/tests/test_remote_integration.py::test_dir FAILED [ 42%] 115s intake/catalog/tests/test_remote_integration.py::test_getitem_and_getattr PASSED [ 42%] 115s intake/catalog/tests/test_remote_integration.py::test_search PASSED [ 42%] 115s intake/catalog/tests/test_remote_integration.py::test_access_subcatalog PASSED [ 42%] 115s intake/catalog/tests/test_remote_integration.py::test_len PASSED [ 43%] 116s intake/catalog/tests/test_remote_integration.py::test_datetime PASSED [ 43%] 116s intake/catalog/tests/test_utils.py::test_expand_templates PASSED [ 43%] 116s intake/catalog/tests/test_utils.py::test_expand_nested_template PASSED [ 43%] 116s intake/catalog/tests/test_utils.py::test_coerce_datetime[None-expected0] PASSED [ 44%] 116s intake/catalog/tests/test_utils.py::test_coerce_datetime[1-expected1] PASSED [ 44%] 116s intake/catalog/tests/test_utils.py::test_coerce_datetime[1988-02-24T13:37+0100-expected2] PASSED [ 44%] 116s intake/catalog/tests/test_utils.py::test_coerce_datetime[test_input3-expected3] PASSED [ 44%] 116s intake/catalog/tests/test_utils.py::test_flatten PASSED [ 45%] 116s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_0] PASSED [ 45%] 116s intake/catalog/tests/test_utils.py::test_coerce[1-int-1_1] PASSED [ 45%] 116s intake/catalog/tests/test_utils.py::test_coerce[1-str-1] PASSED [ 45%] 116s 
intake/catalog/tests/test_utils.py::test_coerce[value3-list-expected3] PASSED [ 45%] 116s intake/catalog/tests/test_utils.py::test_coerce[value4-list-expected4] PASSED [ 46%] 116s intake/catalog/tests/test_utils.py::test_coerce[value5-list[str]-expected5] PASSED [ 46%] 116s intake/cli/client/tests/test_cache.py::test_help PASSED [ 46%] 116s intake/cli/client/tests/test_cache.py::test_list_keys PASSED [ 46%] 117s intake/cli/client/tests/test_cache.py::test_precache PASSED [ 47%] 117s intake/cli/client/tests/test_cache.py::test_clear_all PASSED [ 47%] 117s intake/cli/client/tests/test_cache.py::test_clear_one PASSED [ 47%] 117s intake/cli/client/tests/test_cache.py::test_usage PASSED [ 47%] 118s intake/cli/client/tests/test_conf.py::test_reset PASSED [ 48%] 118s intake/cli/client/tests/test_conf.py::test_info PASSED [ 48%] 118s intake/cli/client/tests/test_conf.py::test_defaults PASSED [ 48%] 118s intake/cli/client/tests/test_conf.py::test_get PASSED [ 48%] 118s intake/cli/client/tests/test_conf.py::test_log_level PASSED [ 49%] 118s intake/cli/client/tests/test_local_integration.py::test_list PASSED [ 49%] 119s intake/cli/client/tests/test_local_integration.py::test_full_list PASSED [ 49%] 119s intake/cli/client/tests/test_local_integration.py::test_describe PASSED [ 49%] 119s intake/cli/client/tests/test_local_integration.py::test_exists_pass PASSED [ 50%] 119s intake/cli/client/tests/test_local_integration.py::test_exists_fail PASSED [ 50%] 120s intake/cli/client/tests/test_local_integration.py::test_discover FAILED [ 50%] 120s intake/cli/client/tests/test_local_integration.py::test_get_pass FAILED [ 50%] 120s intake/cli/client/tests/test_local_integration.py::test_get_fail PASSED [ 50%] 120s intake/cli/client/tests/test_local_integration.py::test_example PASSED [ 51%] 120s intake/cli/server/tests/test_serializer.py::test_dataframe[ser0] SKIPPED [ 51%] 120s intake/cli/server/tests/test_serializer.py::test_dataframe[ser1] SKIPPED [ 51%] 120s intake/cli/server/tests/test_serializer.py::test_dataframe[ser2] SKIPPED [ 51%] 120s intake/cli/server/tests/test_serializer.py::test_ndarray[ser0] PASSED [ 52%] 120s intake/cli/server/tests/test_serializer.py::test_ndarray[ser1] PASSED [ 52%] 120s intake/cli/server/tests/test_serializer.py::test_ndarray[ser2] PASSED [ 52%] 120s intake/cli/server/tests/test_serializer.py::test_python[ser0] PASSED [ 52%] 120s intake/cli/server/tests/test_serializer.py::test_python[ser1] PASSED [ 53%] 120s intake/cli/server/tests/test_serializer.py::test_python[ser2] PASSED [ 53%] 120s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp0] PASSED [ 53%] 120s intake/cli/server/tests/test_serializer.py::test_compression_roundtrip[comp1] PASSED [ 53%] 120s intake/cli/server/tests/test_serializer.py::test_none_compress PASSED [ 54%] 120s intake/cli/server/tests/test_server.py::TestServerV1Info::test_info PASSED [ 54%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_bad_action PASSED [ 54%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer FAILED [ 54%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format FAILED [ 54%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open FAILED [ 55%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_open_direct PASSED [ 55%] 120s intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_part_compressed SKIPPED [ 55%] 120s 
intake/cli/server/tests/test_server.py::TestServerV1Source::test_read_partition SKIPPED [ 55%] 121s intake/cli/server/tests/test_server.py::test_flatten_flag PASSED [ 56%] 121s intake/cli/server/tests/test_server.py::test_port_flag PASSED [ 56%] 121s intake/cli/tests/test_util.py::test_print_entry_info PASSED [ 56%] 121s intake/cli/tests/test_util.py::test_die PASSED [ 56%] 121s intake/cli/tests/test_util.py::Test_nice_join::test_default PASSED [ 57%] 121s intake/cli/tests/test_util.py::Test_nice_join::test_string_conjunction PASSED [ 57%] 121s intake/cli/tests/test_util.py::Test_nice_join::test_None_conjunction PASSED [ 57%] 121s intake/cli/tests/test_util.py::Test_nice_join::test_sep PASSED [ 57%] 121s intake/cli/tests/test_util.py::TestSubcommand::test_initialize_abstract PASSED [ 58%] 121s intake/cli/tests/test_util.py::TestSubcommand::test_invoke_abstract PASSED [ 58%] 121s intake/container/tests/test_generics.py::test_generic_dataframe PASSED [ 58%] 122s intake/container/tests/test_persist.py::test_store PASSED [ 58%] 122s intake/container/tests/test_persist.py::test_backtrack PASSED [ 58%] 122s intake/container/tests/test_persist.py::test_persist_with_nonnumeric_ttl_raises_error PASSED [ 59%] 122s intake/container/tests/test_persist.py::test_undask_persist SKIPPED [ 59%] 122s intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors ERROR [ 59%] 122s intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui ERROR [ 59%] 122s intake/interface/tests/test_init_gui.py::test_display_init_gui ERROR [ 60%] 122s intake/source/tests/test_base.py::test_datasource_base_method_exceptions PASSED [ 60%] 122s intake/source/tests/test_base.py::test_name PASSED [ 60%] 122s intake/source/tests/test_base.py::test_datasource_base_context_manager PASSED [ 60%] 122s intake/source/tests/test_base.py::test_datasource_discover PASSED [ 61%] 122s intake/source/tests/test_base.py::test_datasource_read PASSED [ 61%] 122s intake/source/tests/test_base.py::test_datasource_read_chunked PASSED [ 61%] 122s intake/source/tests/test_base.py::test_datasource_read_partition PASSED [ 61%] 122s intake/source/tests/test_base.py::test_datasource_read_partition_out_of_range PASSED [ 62%] 122s intake/source/tests/test_base.py::test_datasource_to_dask PASSED [ 62%] 122s intake/source/tests/test_base.py::test_datasource_close PASSED [ 62%] 122s intake/source/tests/test_base.py::test_datasource_context_manager PASSED [ 62%] 122s intake/source/tests/test_base.py::test_datasource_pickle PASSED [ 62%] 122s intake/source/tests/test_base.py::test_datasource_python_discover PASSED [ 63%] 122s intake/source/tests/test_base.py::test_datasource_python_read PASSED [ 63%] 122s intake/source/tests/test_base.py::test_datasource_python_to_dask PASSED [ 63%] 122s intake/source/tests/test_base.py::test_yaml_method PASSED [ 63%] 122s intake/source/tests/test_base.py::test_alias_fail PASSED [ 64%] 122s intake/source/tests/test_base.py::test_reconfigure PASSED [ 64%] 122s intake/source/tests/test_base.py::test_import_name[data0] PASSED [ 64%] 122s intake/source/tests/test_base.py::test_import_name[data1] PASSED [ 64%] 122s intake/source/tests/test_base.py::test_import_name[data2] PASSED [ 65%] 122s intake/source/tests/test_base.py::test_import_name[data3] PASSED [ 65%] 122s intake/source/tests/test_base.py::test_import_name[data4] PASSED [ 65%] 122s intake/source/tests/test_cache.py::test_ensure_cache_dir PASSED [ 65%] 122s intake/source/tests/test_cache.py::test_munge_path PASSED [ 66%] 122s 
intake/source/tests/test_cache.py::test_hash PASSED [ 66%] 122s intake/source/tests/test_cache.py::test_path PASSED [ 66%] 122s intake/source/tests/test_cache.py::test_dir_cache PASSED [ 66%] 122s intake/source/tests/test_cache.py::test_compressed_cache PASSED [ 66%] 122s intake/source/tests/test_cache.py::test_filtered_compressed_cache PASSED [ 67%] 122s intake/source/tests/test_cache.py::test_cache_to_cat PASSED [ 67%] 122s intake/source/tests/test_cache.py::test_compressed_cache_infer PASSED [ 67%] 122s intake/source/tests/test_cache.py::test_compressions[tgz] PASSED [ 67%] 122s intake/source/tests/test_cache.py::test_compressions[tbz] PASSED [ 68%] 122s intake/source/tests/test_cache.py::test_compressions[tar] PASSED [ 68%] 122s intake/source/tests/test_cache.py::test_compressions[gz] PASSED [ 68%] 122s intake/source/tests/test_cache.py::test_compressions[bz] PASSED [ 68%] 122s intake/source/tests/test_cache.py::test_compressed_cache_bad PASSED [ 69%] 122s intake/source/tests/test_cache.py::test_dat SKIPPED (DAT not avaiable) [ 69%] 122s intake/source/tests/test_csv.py::test_csv_plugin PASSED [ 69%] 122s intake/source/tests/test_csv.py::test_open PASSED [ 69%] 122s intake/source/tests/test_csv.py::test_discover PASSED [ 70%] 122s intake/source/tests/test_csv.py::test_read PASSED [ 70%] 122s intake/source/tests/test_csv.py::test_read_list PASSED [ 70%] 122s intake/source/tests/test_csv.py::test_read_chunked PASSED [ 70%] 122s intake/source/tests/test_csv.py::test_read_pattern PASSED [ 70%] 123s intake/source/tests/test_csv.py::test_read_pattern_with_cache PASSED [ 71%] 123s intake/source/tests/test_csv.py::test_read_pattern_with_path_as_pattern_str PASSED [ 71%] 123s intake/source/tests/test_csv.py::test_read_partition PASSED [ 71%] 123s intake/source/tests/test_csv.py::test_to_dask PASSED [ 71%] 123s intake/source/tests/test_csv.py::test_plot SKIPPED (could not import...) 
[ 72%] 123s intake/source/tests/test_csv.py::test_close PASSED [ 72%] 123s intake/source/tests/test_csv.py::test_pickle PASSED [ 72%] 123s intake/source/tests/test_derived.py::test_columns PASSED [ 72%] 123s intake/source/tests/test_derived.py::test_df_transform PASSED [ 73%] 123s intake/source/tests/test_derived.py::test_barebones PASSED [ 73%] 123s intake/source/tests/test_derived.py::test_other_cat FAILED [ 73%] 123s intake/source/tests/test_discovery.py::test_package_scan PASSED [ 73%] 123s intake/source/tests/test_discovery.py::test_discover_cli PASSED [ 74%] 123s intake/source/tests/test_discovery.py::test_discover PASSED [ 74%] 123s intake/source/tests/test_discovery.py::test_enable_and_disable PASSED [ 74%] 123s intake/source/tests/test_discovery.py::test_discover_collision PASSED [ 74%] 123s intake/source/tests/test_json.py::test_jsonfile[None] PASSED [ 75%] 123s intake/source/tests/test_json.py::test_jsonfile[gzip] PASSED [ 75%] 123s intake/source/tests/test_json.py::test_jsonfile[bz2] PASSED [ 75%] 123s intake/source/tests/test_json.py::test_jsonfile_none[None] PASSED [ 75%] 123s intake/source/tests/test_json.py::test_jsonfile_none[gzip] PASSED [ 75%] 123s intake/source/tests/test_json.py::test_jsonfile_none[bz2] PASSED [ 76%] 123s intake/source/tests/test_json.py::test_jsonfile_discover[None] PASSED [ 76%] 123s intake/source/tests/test_json.py::test_jsonfile_discover[gzip] PASSED [ 76%] 123s intake/source/tests/test_json.py::test_jsonfile_discover[bz2] PASSED [ 76%] 123s intake/source/tests/test_json.py::test_jsonlfile[None] PASSED [ 77%] 123s intake/source/tests/test_json.py::test_jsonlfile[gzip] PASSED [ 77%] 123s intake/source/tests/test_json.py::test_jsonlfile[bz2] PASSED [ 77%] 123s intake/source/tests/test_json.py::test_jsonfilel_none[None] PASSED [ 77%] 123s intake/source/tests/test_json.py::test_jsonfilel_none[gzip] PASSED [ 78%] 123s intake/source/tests/test_json.py::test_jsonfilel_none[bz2] PASSED [ 78%] 123s intake/source/tests/test_json.py::test_jsonfilel_discover[None] PASSED [ 78%] 123s intake/source/tests/test_json.py::test_jsonfilel_discover[gzip] PASSED [ 78%] 123s intake/source/tests/test_json.py::test_jsonfilel_discover[bz2] PASSED [ 79%] 123s intake/source/tests/test_json.py::test_jsonl_head[None] PASSED [ 79%] 123s intake/source/tests/test_json.py::test_jsonl_head[gzip] PASSED [ 79%] 123s intake/source/tests/test_json.py::test_jsonl_head[bz2] PASSED [ 79%] 123s intake/source/tests/test_npy.py::test_one_file[shape0] PASSED [ 79%] 123s intake/source/tests/test_npy.py::test_one_file[shape1] PASSED [ 80%] 123s intake/source/tests/test_npy.py::test_one_file[shape2] PASSED [ 80%] 123s intake/source/tests/test_npy.py::test_one_file[shape3] PASSED [ 80%] 123s intake/source/tests/test_npy.py::test_one_file[shape4] PASSED [ 80%] 123s intake/source/tests/test_npy.py::test_multi_file[shape0] PASSED [ 81%] 123s intake/source/tests/test_npy.py::test_multi_file[shape1] PASSED [ 81%] 123s intake/source/tests/test_npy.py::test_multi_file[shape2] PASSED [ 81%] 123s intake/source/tests/test_npy.py::test_multi_file[shape3] PASSED [ 81%] 123s intake/source/tests/test_npy.py::test_multi_file[shape4] PASSED [ 82%] 123s intake/source/tests/test_npy.py::test_zarr_minimal SKIPPED (could no...) 
[ 82%] 123s intake/source/tests/test_text.py::test_textfiles PASSED [ 82%] 123s intake/source/tests/test_text.py::test_complex_text[None] PASSED [ 82%] 123s intake/source/tests/test_text.py::test_complex_text[gzip] PASSED [ 83%] 124s intake/source/tests/test_text.py::test_complex_text[bz2] PASSED [ 83%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars0-None] PASSED [ 83%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars0-gzip] PASSED [ 83%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars0-bz2] PASSED [ 83%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars1-None] PASSED [ 84%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars1-gzip] PASSED [ 84%] 124s intake/source/tests/test_text.py::test_complex_bytes[pars1-bz2] PASSED [ 84%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars2-None] PASSED [ 84%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars2-gzip] PASSED [ 85%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars2-bz2] PASSED [ 85%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars3-None] PASSED [ 85%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars3-gzip] PASSED [ 85%] 125s intake/source/tests/test_text.py::test_complex_bytes[pars3-bz2] PASSED [ 86%] 125s intake/source/tests/test_text.py::test_text_persist FAILED [ 86%] 125s intake/source/tests/test_text.py::test_text_export FAILED [ 86%] 125s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_{start_date:%Y%m%d}_{end_date:%Y%m%d}_01_T1_sr_band{band:1d}.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 86%] 125s intake/source/tests/test_utils.py::test_path_to_glob[data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif-data/LT05_L1TP_042033_*_*_01_T1_sr_band*.tif] PASSED [ 87%] 125s intake/source/tests/test_utils.py::test_path_to_glob[{year}/{month}/{day}.csv-*/*/*.csv] PASSED [ 87%] 125s intake/source/tests/test_utils.py::test_path_to_glob[data/**/*.csv-data/**/*.csv] PASSED [ 87%] 125s intake/source/tests/test_utils.py::test_path_to_glob[data/{year:4}{month:02}{day:02}.csv-data/*.csv] PASSED [ 87%] 125s intake/source/tests/test_utils.py::test_path_to_glob[{lone_param}-*] PASSED [ 87%] 125s intake/source/tests/test_utils.py::test_reverse_format[*.csv-apple.csv-expected0] PASSED [ 88%] 125s intake/source/tests/test_utils.py::test_reverse_format[{}.csv-apple.csv-expected1] PASSED [ 88%] 125s intake/source/tests/test_utils.py::test_reverse_format[{fruit}.{}-apple.csv-expected2] PASSED [ 88%] 125s intake/source/tests/test_utils.py::test_reverse_format[data//{fruit}.csv-data/apple.csv-expected3] PASSED [ 88%] 125s intake/source/tests/test_utils.py::test_reverse_format[data\\{fruit}.csv-C:\\data\\apple.csv-expected4] PASSED [ 89%] 125s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-C:\\data\\apple.csv-expected5] PASSED [ 89%] 125s intake/source/tests/test_utils.py::test_reverse_format[data/{fruit}.csv-data//apple.csv-expected6] PASSED [ 89%] 125s intake/source/tests/test_utils.py::test_reverse_format[{num:d}.csv-k.csv-expected7] PASSED [ 89%] 125s intake/source/tests/test_utils.py::test_reverse_format[{year:d}/{month:d}/{day:d}.csv-2016/2/01.csv-expected8] PASSED [ 90%] 125s intake/source/tests/test_utils.py::test_reverse_format[{year:.4}/{month:.2}/{day:.2}.csv-2016/2/01.csv-expected9] PASSED [ 90%] 125s 
intake/source/tests/test_utils.py::test_reverse_format[SRLCCTabularDat/Ecoregions_{emissions}_Precip_{model}.csv-/user/examples/SRLCCTabularDat/Ecoregions_a1b_Precip_ECHAM5-MPI.csv-expected10] PASSED [ 90%] 125s intake/source/tests/test_utils.py::test_reverse_format[data_{date:%Y_%m_%d}.csv-data_2016_10_01.csv-expected11] PASSED [ 90%] 125s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5}-PA19104-expected12] PASSED [ 91%] 125s intake/source/tests/test_utils.py::test_reverse_format[{state}{zip:5d}.csv-PA19104.csv-expected13] PASSED [ 91%] 125s intake/source/tests/test_utils.py::test_reverse_format[{state:2}{zip:d}.csv-PA19104.csv-expected14] PASSED [ 91%] 125s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{date:%Y%m%d}-expected0] PASSED [ 91%] 125s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{num: .2f}-expected1] PASSED [ 91%] 125s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[{percentage:.2%}-expected2] PASSED [ 92%] 125s intake/source/tests/test_utils.py::test_roundtripping_reverse_format[data/{year:4d}{month:02d}{day:02d}.csv-expected3] PASSED [ 92%] 125s intake/source/tests/test_utils.py::test_reverse_format_errors PASSED [ 92%] 125s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year}_{month}_{day}.csv] PASSED [ 92%] 125s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{year:d}_{month:02d}_{day:02d}.csv] PASSED [ 93%] 125s intake/source/tests/test_utils.py::test_roundtrip_reverse_formats[data_{date:%Y_%m_%d}.csv] PASSED [ 93%] 125s intake/source/tests/test_utils.py::test_path_to_pattern[http://data/band{band:1d}.tif-metadata0-/band{band:1d}.tif] PASSED [ 93%] 125s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-metadata1-/data/band{band:1d}.tif] PASSED [ 93%] 125s intake/source/tests/test_utils.py::test_path_to_pattern[/data/band{band:1d}.tif-None-/data/band{band:1d}.tif] PASSED [ 94%] 125s intake/tests/test_config.py::test_load_conf[conf0] PASSED [ 94%] 125s intake/tests/test_config.py::test_load_conf[conf1] PASSED [ 94%] 125s intake/tests/test_config.py::test_load_conf[conf2] PASSED [ 94%] 126s intake/tests/test_config.py::test_basic PASSED [ 95%] 126s intake/tests/test_config.py::test_cli PASSED [ 95%] 126s intake/tests/test_config.py::test_persist_modes PASSED [ 95%] 127s intake/tests/test_config.py::test_conf PASSED [ 95%] 127s intake/tests/test_config.py::test_conf_auth PASSED [ 95%] 127s intake/tests/test_config.py::test_pathdirs PASSED [ 96%] 127s intake/tests/test_top_level.py::test_autoregister_open PASSED [ 96%] 127s intake/tests/test_top_level.py::test_default_catalogs PASSED [ 96%] 127s intake/tests/test_top_level.py::test_user_catalog PASSED [ 96%] 127s intake/tests/test_top_level.py::test_open_styles PASSED [ 97%] 129s intake/tests/test_top_level.py::test_path_catalog PASSED [ 97%] 129s intake/tests/test_top_level.py::test_bad_open PASSED [ 97%] 129s intake/tests/test_top_level.py::test_output_notebook SKIPPED (could ...) 
[ 97%] 129s intake/tests/test_top_level.py::test_old_usage PASSED [ 98%] 129s intake/tests/test_top_level.py::test_no_imports PASSED [ 98%] 129s intake/tests/test_top_level.py::test_nested_catalog_access PASSED [ 98%] 129s intake/tests/test_utils.py::test_windows_file_path PASSED [ 98%] 129s intake/tests/test_utils.py::test_make_path_posix_removes_double_sep PASSED [ 99%] 129s intake/tests/test_utils.py::test_noops[~/fake.file] PASSED [ 99%] 129s intake/tests/test_utils.py::test_noops[https://example.com] PASSED [ 99%] 129s intake/tests/test_utils.py::test_roundtrip_file_path PASSED [ 99%] 129s intake/tests/test_utils.py::test_yaml_tuples PASSED [100%] 129s 129s ==================================== ERRORS ==================================== 129s ____________ ERROR at setup of test_no_panel_does_not_raise_errors _____________ 129s 129s attr = 'pytest_plugins' 129s 129s def __getattr__(attr): 129s if attr == 'instance': 129s do_import() 129s > return gl['instance'] 129s E KeyError: 'instance' 129s 129s intake/interface/__init__.py:39: KeyError 129s _______________ ERROR at setup of test_no_panel_display_init_gui _______________ 129s 129s attr = 'pytest_plugins' 129s 129s def __getattr__(attr): 129s if attr == 'instance': 129s do_import() 129s > return gl['instance'] 129s E KeyError: 'instance' 129s 129s intake/interface/__init__.py:39: KeyError 129s ___________________ ERROR at setup of test_display_init_gui ____________________ 129s 129s attr = 'pytest_plugins' 129s 129s def __getattr__(attr): 129s if attr == 'instance': 129s do_import() 129s > return gl['instance'] 129s E KeyError: 'instance' 129s 129s intake/interface/__init__.py:39: KeyError 129s =================================== FAILURES =================================== 129s ______________________________ test_load_textfile ______________________________ 129s 129s catalog_cache = 129s 129s def test_load_textfile(catalog_cache): 129s cat = catalog_cache['text_cache'] 129s cache = cat.cache[0] 129s 129s cache_paths = cache.load(cat._urlpath, output=False) 129s > cache_path = cache_paths[-1] 129s E TypeError: 'NoneType' object is not subscriptable 129s 129s intake/catalog/tests/test_caching_integration.py:53: TypeError 129s _________________________________ test_nested __________________________________ 129s 129s args = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv',) 129s kwargs = {'storage_options': None} 129s func = .read at 0x7c2b51791080> 129s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files') 129s 129s @wraps(fn) 129s def wrapper(*args, **kwargs): 129s func = getattr(self, dispatch_name) 129s try: 129s > return func(*args, **kwargs) 129s 129s /usr/lib/python3/dist-packages/dask/backends.py:140: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 129s return read_pandas( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s reader = 129s urlpath = '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv' 129s blocksize = 'default', lineterminator = '\n', compression = 'infer' 129s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 129s storage_options = None, include_path_column = False, kwargs = {} 129s reader_name = 'read_csv', b_lineterminator = 
b'\n', kw = 'chunksize' 129s lastskiprow = 0, firstrow = 0 129s 129s def read_pandas( 129s reader, 129s urlpath, 129s blocksize="default", 129s lineterminator=None, 129s compression="infer", 129s sample=256000, 129s sample_rows=10, 129s enforce=False, 129s assume_missing=False, 129s storage_options=None, 129s include_path_column=False, 129s **kwargs, 129s ): 129s reader_name = reader.__name__ 129s if lineterminator is not None and len(lineterminator) == 1: 129s kwargs["lineterminator"] = lineterminator 129s else: 129s lineterminator = "\n" 129s if "encoding" in kwargs: 129s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 129s empty_blob = "".encode(kwargs["encoding"]) 129s if empty_blob: 129s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 129s # start of the line terminator, since this value is not a full file. 129s b_lineterminator = b_lineterminator[len(empty_blob) :] 129s else: 129s b_lineterminator = lineterminator.encode() 129s if include_path_column and isinstance(include_path_column, bool): 129s include_path_column = "path" 129s if "index" in kwargs or ( 129s "index_col" in kwargs and kwargs.get("index_col") is not False 129s ): 129s raise ValueError( 129s "Keywords 'index' and 'index_col' not supported, except for " 129s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 129s ) 129s for kw in ["iterator", "chunksize"]: 129s if kw in kwargs: 129s raise ValueError(f"{kw} not supported for dd.{reader_name}") 129s if kwargs.get("nrows", None): 129s raise ValueError( 129s "The 'nrows' keyword is not supported by " 129s "`dd.{0}`. To achieve the same behavior, it's " 129s "recommended to use `dd.{0}(...)." 129s "head(n=nrows)`".format(reader_name) 129s ) 129s if isinstance(kwargs.get("skiprows"), int): 129s lastskiprow = firstrow = kwargs.get("skiprows") 129s elif kwargs.get("skiprows") is None: 129s lastskiprow = firstrow = 0 129s else: 129s # When skiprows is a list, we expect more than max(skiprows) to 129s # be included in the sample. This means that [0,2] will work well, 129s # but [0, 440] might not work. 129s skiprows = set(kwargs.get("skiprows")) 129s lastskiprow = max(skiprows) 129s # find the firstrow that is not skipped, for use as header 129s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 129s if isinstance(kwargs.get("header"), list): 129s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 129s if isinstance(kwargs.get("converters"), dict) and include_path_column: 129s path_converter = kwargs.get("converters").get(include_path_column, None) 129s else: 129s path_converter = None 129s 129s # If compression is "infer", inspect the (first) path suffix and 129s # set the proper compression option if the suffix is recognized. 
129s if compression == "infer": 129s # Translate the input urlpath to a simple path list 129s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 129s 2 129s ] 129s 129s # Check for at least one valid path 129s if len(paths) == 0: 129s > raise OSError(f"{urlpath} resolved to no files") 129s E OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 129s 129s The above exception was the direct cause of the following exception: 129s 129s catalog1 = 129s 129s def test_nested(catalog1): 129s assert 'nested' in catalog1 129s assert 'entry1' in catalog1.nested.nested() 129s > assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read()) 129s 129s intake/catalog/tests/test_local.py:86: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/source/csv.py:129: in read 129s self._get_schema() 129s intake/source/csv.py:115: in _get_schema 129s self._open_dataset(urlpath) 129s intake/source/csv.py:94: in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s args = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv',) 129s kwargs = {'storage_options': None} 129s func = .read at 0x7c2b51791080> 129s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files') 129s 129s @wraps(fn) 129s def wrapper(*args, **kwargs): 129s func = getattr(self, dispatch_name) 129s try: 129s return func(*args, **kwargs) 129s except Exception as e: 129s try: 129s exc = type(e)( 129s f"An error occurred while calling the {funcname(func)} " 129s f"method registered to the {self.backend} backend.\n" 129s f"Original Message: {e}" 129s ) 129s except TypeError: 129s raise e 129s else: 129s > raise exc from e 129s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
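The test_nested failure above, and the remote-integration failures that follow, all trace back to the same root cause: the catalog entry's urlpath glob (/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv) matches no files in the build tree, so dask's read_pandas() raises OSError before intake can discover a schema. A minimal sketch of that behaviour, using a hypothetical non-matching glob rather than the path from this run:

    # Sketch only, not part of the test suite: dask.dataframe.read_csv expands
    # the glob up front and raises OSError when it resolves to zero files,
    # which is the error wrapped by the pandas-backend dispatcher above.
    import dask.dataframe as dd

    try:
        dd.read_csv("/tmp/no-such-dir/entry1_*.csv")  # hypothetical path
    except OSError as exc:
        print(exc)  # "... resolved to no files"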
129s E Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 129s ______________________________ test_info_describe ______________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_info_describe(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1', 129s 'entry1_part', 'remote_env', 129s 'local_env', 'text', 'arr', 'datetime']) 129s 129s > info = catalog['entry1'].describe() 129s 129s intake/catalog/tests/test_remote_integration.py:29: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ---------------------------- Captured stderr setup ----------------------------- 129s 2025-11-17 16:50:25,208 - intake - INFO - __main__.py:main:L53 - Creating catalog from: 129s 2025-11-17 16:50:25,208 - intake - INFO - __main__.py:main:L55 - - /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/catalog1.yml 129s 2025-11-17 16:50:25,384 - intake - INFO - __main__.py:main:L62 - catalog_args: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/catalog1.yml 129s 2025-11-17 16:50:25,384 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483 129s ----------------------------- Captured stderr call ----------------------------- 129s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 129s Dask dataframe query planning is disabled because dask-expr is not installed. 129s 129s You can install it with `pip install dask[dataframe]` or `conda install dask`. 129s This will raise in a future version. 129s 129s warnings.warn(msg, FutureWarning) 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
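The captured stderr above also shows dask falling back to its legacy dataframe implementation because dask-expr is not installed; that FutureWarning appears unrelated to the missing-file failures. A small, hypothetical check (not part of intake or this test suite) for whether the optional dask-expr backend is importable:

    # Hypothetical helper: report whether dask-expr (the query-planning
    # backend named in the FutureWarning) can be imported.
    import importlib.util

    if importlib.util.find_spec("dask_expr") is None:
        print("dask-expr not installed; dask uses the legacy dataframe code path")
    else:
        print("dask-expr available; query planning can be enabled")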
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 89.74ms 129s ______________________________ test_remote_direct ______________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_remote_direct(intake_server): 129s from intake.container.dataframe import RemoteDataFrame 129s catalog = open_catalog(intake_server) 129s > s0 = catalog.entry1() 129s 129s intake/catalog/tests/test_remote_integration.py:74: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 2.01ms 129s _______________________ test_remote_datasource_interface _______________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_remote_datasource_interface(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog['entry1'] 129s 129s intake/catalog/tests/test_remote_integration.py:101: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.79ms 129s __________________________________ test_read ___________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_read(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog['entry1'] 129s 129s intake/catalog/tests/test_remote_integration.py:116: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.77ms 129s _______________________________ test_read_chunks _______________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_read_chunks(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog.entry1 129s 129s intake/catalog/tests/test_remote_integration.py:170: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.95ms 129s _____________________________ test_read_partition ______________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_read_partition(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog.entry1 129s 129s intake/catalog/tests/test_remote_integration.py:186: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.85ms 129s __________________________________ test_close __________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_close(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog.entry1 129s 129s intake/catalog/tests/test_remote_integration.py:201: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.86ms 129s __________________________________ test_with ___________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_with(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > with catalog.entry1 as f: 129s 129s intake/catalog/tests/test_remote_integration.py:208: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.79ms 129s _________________________________ test_pickle __________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_pickle(intake_server): 129s catalog = open_catalog(intake_server) 129s 129s > d = catalog.entry1 129s 129s intake/catalog/tests/test_remote_integration.py:215: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.78ms 129s _________________________________ test_to_dask _________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_to_dask(intake_server): 129s catalog = open_catalog(intake_server) 129s > d = catalog.entry1 129s 129s intake/catalog/tests/test_remote_integration.py:231: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/catalog/base.py:391: in __getattr__ 129s return self[item] # triggers reload_on_change 129s intake/catalog/base.py:436: in __getitem__ 129s s = self._get_entry(key) 129s intake/catalog/utils.py:45: in wrapper 129s return f(self, *args, **kwargs) 129s intake/catalog/base.py:323: in _get_entry 129s return entry() 129s intake/catalog/entry.py:77: in __call__ 129s s = self.get(**kwargs) 129s intake/catalog/remote.py:459: in get 129s return open_remote( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe' 129s user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}} 129s page_size = None, persist_mode = 'default' 129s auth = , getenv = True 129s getshell = True 129s 129s def open_remote(url, entry, container, user_parameters, description, http_args, 129s page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None): 129s """Create either local direct data source or remote streamed source""" 129s from intake.container import container_map 129s import msgpack 129s import requests 129s from requests.compat import urljoin 129s 129s if url.startswith('intake://'): 129s url = url[len('intake://'):] 129s payload = dict(action='open', 129s name=entry, 129s parameters=user_parameters, 129s available_plugins=list(plugin_registry)) 129s req = requests.post(urljoin(url, 'v1/source'), 129s data=msgpack.packb(payload, **pack_kwargs), 129s **http_args) 129s if req.ok: 129s response = msgpack.unpackb(req.content, **unpack_kwargs) 129s 129s if 'plugin' in response: 129s pl = response['plugin'] 129s pl = [pl] if isinstance(pl, str) else pl 129s # Direct access 129s for p in pl: 129s if p in plugin_registry: 129s source = plugin_registry[p](**response['args']) 129s proxy = False 129s break 129s else: 129s proxy = True 129s else: 129s proxy = True 129s if proxy: 129s response.pop('container') 129s response.update({'name': entry, 'parameters': user_parameters}) 129s if container == 'catalog': 129s response.update({'auth': auth, 129s 'getenv': getenv, 129s 'getshell': getshell, 129s 'page_size': page_size, 129s 'persist_mode': persist_mode 129s # TODO ttl? 129s # TODO storage_options? 129s }) 129s source = container_map[container](url, http_args, **response) 129s source.description = description 129s return source 129s else: 129s > raise Exception('Server error: %d, %s' % (req.status_code, req.reason)) 129s E Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. 
Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s intake/catalog/remote.py:519: Exception 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests//entry1_*.csv resolved to no files 129s 400 POST /v1/source (::1): Discover failed 129s 400 POST /v1/source (::1) 1.83ms 129s _____________________________ test_remote_sequence _____________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_remote_sequence(intake_server): 129s import glob 129s d = os.path.dirname(TEST_CATALOG_PATH) 129s catalog = open_catalog(intake_server) 129s assert 'text' in catalog 129s s = catalog.text() 129s s.discover() 129s > assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml'))) 129s E AssertionError: assert 0 == 29 129s E + where 0 = sources:\n text:\n args:\n dtype: null\n extra_metadata:\n catalog_dir: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/\n headers:\n headers: {}\n name: text\n npartitions: 0\n parameters: {}\n shape:\n - null\n source_id: 9ca6c417-f6bd-4a40-9d0d-e26b1590da59\n url: http://localhost:7483/\n description: textfiles in this dir\n driver: intake.container.semistructured.RemoteSequenceSource\n metadata:\n catalog_dir: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/\n.npartitions 129s E + and 29 = len(['/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/data_source_value_non_dict.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/data_source_name_non_string.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_value_bad_choice.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_name_non_string.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_missing_required.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/catalog1.yml', ...]) 129s E + where ['/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/data_source_value_non_dict.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/data_source_name_non_string.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_value_bad_choice.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_name_non_string.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/params_missing_required.yml', '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/catalog1.yml', ...] = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/*.yml') 129s E + where = .glob 129s E + and '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests/*.yml' = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/catalog/tests', '*.yml') 129s E + where = .join 129s E + where = os.path 129s 129s intake/catalog/tests/test_remote_integration.py:263: AssertionError 129s ___________________________________ test_dir ___________________________________ 129s 129s intake_server = 'intake://localhost:7483' 129s 129s def test_dir(intake_server): 129s PAGE_SIZE = 2 129s catalog = open_catalog(intake_server, page_size=PAGE_SIZE) 129s assert len(catalog._entries._page_cache) == 0 129s assert len(catalog._entries._direct_lookup_cache) == 0 129s assert not catalog._entries.complete 129s 129s with pytest.warns(UserWarning, match="Tab-complete"): 129s key_completions = catalog._ipython_key_completions_() 129s with pytest.warns(UserWarning, match="Tab-complete"): 129s dir_ = dir(catalog) 129s # __dir__ triggers loading the first page. 
129s assert len(catalog._entries._page_cache) == 2 129s assert len(catalog._entries._direct_lookup_cache) == 0 129s assert not catalog._entries.complete 129s assert set(key_completions) == set(['use_example1', 'nested']) 129s assert 'metadata' in dir_ # a normal attribute 129s assert 'use_example1' in dir_ # an entry from the first page 129s assert 'arr' not in dir_ # an entry we haven't cached yet 129s 129s # Trigger fetching one specific name. 129s catalog['arr'] 129s with pytest.warns(UserWarning, match="Tab-complete"): 129s dir_ = dir(catalog) 129s with pytest.warns(UserWarning, match="Tab-complete"): 129s key_completions = catalog._ipython_key_completions_() 129s assert 'metadata' in dir_ 129s assert 'arr' in dir_ # an entry cached via direct access 129s assert 'arr' in key_completions 129s 129s # Load everything. 129s list(catalog) 129s assert catalog._entries.complete 129s > with pytest.warns(None) as record: 129s 129s intake/catalog/tests/test_remote_integration.py:338: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s self = WarningsChecker(record=True), expected_warning = None, match_expr = None 129s 129s def __init__( 129s self, 129s expected_warning: type[Warning] | tuple[type[Warning], ...] = Warning, 129s match_expr: str | Pattern[str] | None = None, 129s *, 129s _ispytest: bool = False, 129s ) -> None: 129s check_ispytest(_ispytest) 129s super().__init__(_ispytest=True) 129s 129s msg = "exceptions must be derived from Warning, not %s" 129s if isinstance(expected_warning, tuple): 129s for exc in expected_warning: 129s if not issubclass(exc, Warning): 129s raise TypeError(msg % type(exc)) 129s expected_warning_tup = expected_warning 129s elif isinstance(expected_warning, type) and issubclass( 129s expected_warning, Warning 129s ): 129s expected_warning_tup = (expected_warning,) 129s else: 129s > raise TypeError(msg % type(expected_warning)) 129s E TypeError: exceptions must be derived from Warning, not 129s 129s /usr/lib/python3/dist-packages/_pytest/recwarn.py:279: TypeError 129s ________________________________ test_discover _________________________________ 129s 129s def test_discover(): 129s cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML, 129s 'entry1'] 129s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 129s universal_newlines=True) 129s out, _ = process.communicate() 129s 129s > assert "'dtype':" in out 129s E assert "'dtype':" in '' 129s 129s intake/cli/client/tests/test_local_integration.py:89: AssertionError 129s ----------------------------- Captured stderr call ----------------------------- 129s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 129s Dask dataframe query planning is disabled because dask-expr is not installed. 129s 129s You can install it with `pip install dask[dataframe]` or `conda install dask`. 129s This will raise in a future version. 
129s 129s warnings.warn(msg, FutureWarning) 129s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 129s ________________________________ test_get_pass _________________________________ 129s 129s def test_get_pass(): 129s cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1'] 129s process = subprocess.Popen(cmd, stdout=subprocess.PIPE, 129s universal_newlines=True) 129s out, _ = process.communicate() 129s 129s > assert 'Charlie1 25.0 3' in out 129s E AssertionError: assert 'Charlie1 25.0 3' in '' 129s 129s intake/cli/client/tests/test_local_integration.py:101: AssertionError 129s ----------------------------- Captured stderr call ----------------------------- 129s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning: 129s Dask dataframe query planning is disabled because dask-expr is not installed. 129s 129s You can install it with `pip install dask[dataframe]` or `conda install dask`. 129s This will raise in a future version. 129s 129s warnings.warn(msg, FutureWarning) 129s ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/client/tests//entry1_*.csv resolved to no files') 129s ______________________ TestServerV1Source.test_idle_timer ______________________ 129s 129s self = 129s 129s def test_idle_timer(self): 129s self.server.start_periodic_functions(close_idle_after=0.1, 129s remove_idle_after=0.2) 129s 129s msg = dict(action='open', name='entry1', parameters={}) 129s > resp_msg, = self.make_post_request(msg) 129s 129s intake/cli/server/tests/test_server.py:208: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/cli/server/tests/test_server.py:96: in make_post_request 129s self.assertEqual(response.code, expected_status) 129s E AssertionError: 400 != 200 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 
129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s ------------------------------ Captured log call ------------------------------- 129s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 129s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 3.32ms 129s ______________________ TestServerV1Source.test_no_format _______________________ 129s 129s self = 129s 129s def test_no_format(self): 129s msg = dict(action='open', name='entry1', parameters={}) 129s > resp_msg, = self.make_post_request(msg) 129s 129s intake/cli/server/tests/test_server.py:195: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/cli/server/tests/test_server.py:96: in make_post_request 129s self.assertEqual(response.code, expected_status) 129s E AssertionError: 400 != 200 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 
129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s ------------------------------ Captured log call ------------------------------- 129s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 129s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 2.80ms 129s _________________________ TestServerV1Source.test_open _________________________ 129s 129s self = 129s 129s def test_open(self): 129s msg = dict(action='open', name='entry1', parameters={}) 129s > resp_msg, = self.make_post_request(msg) 129s 129s intake/cli/server/tests/test_server.py:112: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/cli/server/tests/test_server.py:96: in make_post_request 129s self.assertEqual(response.code, expected_status) 129s E AssertionError: 400 != 200 129s ----------------------------- Captured stderr call ----------------------------- 129s Traceback (most recent call last): 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 140, in wrapper 129s return func(*args, **kwargs) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 877, in read 129s return read_pandas( 129s reader, 129s ...<10 lines>... 
129s **kwargs, 129s ) 129s File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 644, in read_pandas 129s raise OSError(f"{urlpath} resolved to no files") 129s OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s 129s The above exception was the direct cause of the following exception: 129s 129s Traceback (most recent call last): 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/server.py", line 306, in post 129s source.discover() 129s ~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 347, in discover 129s self._load_metadata() 129s ~~~~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/base.py", line 285, in _load_metadata 129s self._schema = self._get_schema() 129s ~~~~~~~~~~~~~~~~^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 115, in _get_schema 129s self._open_dataset(urlpath) 129s ~~~~~~~~~~~~~~~~~~^^^^^^^^^ 129s File "/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/csv.py", line 94, in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s ~~~~~~~~~~~~~~~~~~~~~~~^ 129s urlpath, storage_options=self._storage_options, 129s ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 129s **self._csv_kwargs) 129s ^^^^^^^^^^^^^^^^^^^ 129s File "/usr/lib/python3/dist-packages/dask/backends.py", line 151, in wrapper 129s raise exc from e 129s OSError: An error occurred while calling the read_csv method registered to the pandas backend. 129s Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/cli/server/tests//entry1_*.csv resolved to no files 129s ------------------------------ Captured log call ------------------------------- 129s WARNING tornado.general:web.py:1932 400 POST /v1/source (127.0.0.1): Discover failed 129s WARNING tornado.access:web.py:2407 400 POST /v1/source (127.0.0.1) 2.82ms 129s ________________________________ test_other_cat ________________________________ 129s 129s args = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 129s kwargs = {'storage_options': None} 129s func = .read at 0x7c2b51791080> 129s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 129s 129s @wraps(fn) 129s def wrapper(*args, **kwargs): 129s func = getattr(self, dispatch_name) 129s try: 129s > return func(*args, **kwargs) 129s 129s /usr/lib/python3/dist-packages/dask/backends.py:140: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:877: in read 129s return read_pandas( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s reader = 129s urlpath = '/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv' 129s blocksize = 'default', lineterminator = '\n', compression = 'infer' 129s sample = 256000, sample_rows = 10, enforce = False, assume_missing = False 129s storage_options = None, include_path_column = False, kwargs = {} 129s reader_name = 'read_csv', b_lineterminator = b'\n', kw = 'chunksize' 129s lastskiprow = 0, firstrow = 0 129s 129s def read_pandas( 129s reader, 129s urlpath, 129s blocksize="default", 129s lineterminator=None, 129s compression="infer", 129s sample=256000, 129s 
sample_rows=10, 129s enforce=False, 129s assume_missing=False, 129s storage_options=None, 129s include_path_column=False, 129s **kwargs, 129s ): 129s reader_name = reader.__name__ 129s if lineterminator is not None and len(lineterminator) == 1: 129s kwargs["lineterminator"] = lineterminator 129s else: 129s lineterminator = "\n" 129s if "encoding" in kwargs: 129s b_lineterminator = lineterminator.encode(kwargs["encoding"]) 129s empty_blob = "".encode(kwargs["encoding"]) 129s if empty_blob: 129s # This encoding starts with a Byte Order Mark (BOM), so strip that from the 129s # start of the line terminator, since this value is not a full file. 129s b_lineterminator = b_lineterminator[len(empty_blob) :] 129s else: 129s b_lineterminator = lineterminator.encode() 129s if include_path_column and isinstance(include_path_column, bool): 129s include_path_column = "path" 129s if "index" in kwargs or ( 129s "index_col" in kwargs and kwargs.get("index_col") is not False 129s ): 129s raise ValueError( 129s "Keywords 'index' and 'index_col' not supported, except for " 129s "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead" 129s ) 129s for kw in ["iterator", "chunksize"]: 129s if kw in kwargs: 129s raise ValueError(f"{kw} not supported for dd.{reader_name}") 129s if kwargs.get("nrows", None): 129s raise ValueError( 129s "The 'nrows' keyword is not supported by " 129s "`dd.{0}`. To achieve the same behavior, it's " 129s "recommended to use `dd.{0}(...)." 129s "head(n=nrows)`".format(reader_name) 129s ) 129s if isinstance(kwargs.get("skiprows"), int): 129s lastskiprow = firstrow = kwargs.get("skiprows") 129s elif kwargs.get("skiprows") is None: 129s lastskiprow = firstrow = 0 129s else: 129s # When skiprows is a list, we expect more than max(skiprows) to 129s # be included in the sample. This means that [0,2] will work well, 129s # but [0, 440] might not work. 129s skiprows = set(kwargs.get("skiprows")) 129s lastskiprow = max(skiprows) 129s # find the firstrow that is not skipped, for use as header 129s firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows)) 129s if isinstance(kwargs.get("header"), list): 129s raise TypeError(f"List of header rows not supported for dd.{reader_name}") 129s if isinstance(kwargs.get("converters"), dict) and include_path_column: 129s path_converter = kwargs.get("converters").get(include_path_column, None) 129s else: 129s path_converter = None 129s 129s # If compression is "infer", inspect the (first) path suffix and 129s # set the proper compression option if the suffix is recognized. 
129s if compression == "infer": 129s # Translate the input urlpath to a simple path list 129s paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[ 129s 2 129s ] 129s 129s # Check for at least one valid path 129s if len(paths) == 0: 129s > raise OSError(f"{urlpath} resolved to no files") 129s E OSError: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 129s 129s /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:644: OSError 129s 129s The above exception was the direct cause of the following exception: 129s 129s def test_other_cat(): 129s cat = intake.open_catalog(catfile) 129s > df1 = cat.other_cat.read() 129s 129s intake/source/tests/test_derived.py:35: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/source/derived.py:252: in read 129s return self.to_dask().compute() 129s intake/source/derived.py:239: in to_dask 129s self._df = self._transform(self._source.to_dask(), 129s intake/source/csv.py:133: in to_dask 129s self._get_schema() 129s intake/source/csv.py:115: in _get_schema 129s self._open_dataset(urlpath) 129s intake/source/csv.py:94: in _open_dataset 129s self._dataframe = dask.dataframe.read_csv( 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s args = ('/tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv',) 129s kwargs = {'storage_options': None} 129s func = .read at 0x7c2b51791080> 129s exc = OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files') 129s 129s @wraps(fn) 129s def wrapper(*args, **kwargs): 129s func = getattr(self, dispatch_name) 129s try: 129s return func(*args, **kwargs) 129s except Exception as e: 129s try: 129s exc = type(e)( 129s f"An error occurred while calling the {funcname(func)} " 129s f"method registered to the {self.backend} backend.\n" 129s f"Original Message: {e}" 129s ) 129s except TypeError: 129s raise e 129s else: 129s > raise exc from e 129s E OSError: An error occurred while calling the read_csv method registered to the pandas backend. 
129s E Original Message: /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files 129s 129s /usr/lib/python3/dist-packages/dask/backends.py:151: OSError 129s ______________________________ test_text_persist _______________________________ 129s 129s temp_cache = None 129s 129s def test_text_persist(temp_cache): 129s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 129s s = cat.sometext() 129s > s2 = s.persist() 129s 129s intake/source/tests/test_text.py:88: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/source/base.py:226: in persist 129s out = self._export(store.getdir(self), **kwargs) 129s intake/source/base.py:460: in _export 129s out = method(self, path=path, **kwargs) 129s intake/container/semistructured.py:70: in _persist 129s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 129s intake/container/semistructured.py:90: in _data_to_source 129s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 129s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 129s fs, fs_token, paths = get_fs_token_paths( 129s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 129s paths = _expand_paths(paths, name_function, num) 129s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 129s name_function = build_name_function(num - 1) 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s 129s max_int = -0.99999999 129s 129s def build_name_function(max_int: float) -> Callable[[int], str]: 129s """Returns a function that receives a single integer 129s and returns it as a string padded by enough zero characters 129s to align with maximum possible integer 129s 129s >>> name_f = build_name_function(57) 129s 129s >>> name_f(7) 129s '07' 129s >>> name_f(31) 129s '31' 129s >>> build_name_function(1000)(42) 129s '0042' 129s >>> build_name_function(999)(42) 129s '042' 129s >>> build_name_function(0)(0) 129s '0' 129s """ 129s # handle corner cases max_int is 0 or exact power of 10 129s max_int += 1e-8 129s 129s > pad_length = int(math.ceil(math.log10(max_int))) 129s E ValueError: math domain error 129s 129s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError 129s _______________________________ test_text_export _______________________________ 129s 129s temp_cache = None 129s 129s def test_text_export(temp_cache): 129s import tempfile 129s outdir = tempfile.mkdtemp() 129s cat = intake.open_catalog(os.path.join(here, 'sources.yaml')) 129s s = cat.sometext() 129s > out = s.export(outdir) 129s 129s intake/source/tests/test_text.py:97: 129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 129s intake/source/base.py:452: in export 129s return self._export(path, **kwargs) 129s intake/source/base.py:460: in _export 129s out = method(self, path=path, **kwargs) 129s intake/container/semistructured.py:70: in _persist 129s return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs) 129s intake/container/semistructured.py:90: in _data_to_source 129s files = open_files(posixpath.join(path, 'part.*'), mode='wt', 129s /usr/lib/python3/dist-packages/fsspec/core.py:295: in open_files 129s fs, fs_token, paths = get_fs_token_paths( 129s /usr/lib/python3/dist-packages/fsspec/core.py:684: in get_fs_token_paths 129s paths = _expand_paths(paths, name_function, num) 129s /usr/lib/python3/dist-packages/fsspec/core.py:701: in _expand_paths 129s name_function = 
build_name_function(num - 1)
129s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
129s
129s max_int = -0.99999999
129s
129s def build_name_function(max_int: float) -> Callable[[int], str]:
129s """Returns a function that receives a single integer
129s and returns it as a string padded by enough zero characters
129s to align with maximum possible integer
129s
129s >>> name_f = build_name_function(57)
129s
129s >>> name_f(7)
129s '07'
129s >>> name_f(31)
129s '31'
129s >>> build_name_function(1000)(42)
129s '0042'
129s >>> build_name_function(999)(42)
129s '042'
129s >>> build_name_function(0)(0)
129s '0'
129s """
129s # handle corner cases max_int is 0 or exact power of 10
129s max_int += 1e-8
129s
129s > pad_length = int(math.ceil(math.log10(max_int)))
129s E ValueError: math domain error
129s
129s /usr/lib/python3/dist-packages/fsspec/utils.py:177: ValueError
129s =============================== warnings summary ===============================
129s intake/catalog/tests/test_alias.py::test_simple
129s /usr/lib/python3/dist-packages/dask/dataframe/__init__.py:49: FutureWarning:
129s Dask dataframe query planning is disabled because dask-expr is not installed.
129s
129s You can install it with `pip install dask[dataframe]` or `conda install dask`.
129s This will raise in a future version.
129s
129s warnings.warn(msg, FutureWarning)
129s
129s intake/source/tests/test_cache.py::test_filtered_compressed_cache
129s intake/source/tests/test_cache.py::test_compressions[tgz]
129s intake/source/tests/test_cache.py::test_compressions[tgz]
129s /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/decompress.py:27: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
129s tar.extractall(outpath)
129s
129s intake/source/tests/test_cache.py::test_compressions[tbz]
129s intake/source/tests/test_cache.py::test_compressions[tbz]
129s /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/decompress.py:37: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
129s tar.extractall(outpath)
129s
129s intake/source/tests/test_cache.py::test_compressions[tar]
129s intake/source/tests/test_cache.py::test_compressions[tar]
129s /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/decompress.py:47: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
129s tar.extractall(outpath)
129s
129s intake/source/tests/test_discovery.py::test_package_scan
129s intake/source/tests/test_discovery.py::test_package_scan
129s intake/source/tests/test_discovery.py::test_enable_and_disable
129s intake/source/tests/test_discovery.py::test_discover_collision
129s /tmp/autopkgtest.zAAzqt/build.B3R/src/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed
129s warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning)
129s
129s -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
129s =========================== short test summary info ============================
129s FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile
129s FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface
129s FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition
129s FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except...
129s FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence
129s FAILED intake/catalog/tests/test_remote_integration.py::test_dir - TypeError:...
129s FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass...
129s FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass...
129s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer
129s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format
129s FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open
129s FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro...
129s FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math...
129s FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ...
129s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_does_not_raise_errors
129s ERROR intake/interface/tests/test_init_gui.py::test_no_panel_display_init_gui
129s ERROR intake/interface/tests/test_init_gui.py::test_display_init_gui - KeyErr...
129s ====== 22 failed, 379 passed, 31 skipped, 12 warnings, 3 errors in 32.74s ======
130s autopkgtest [16:50:41]: test run-unit-test: -----------------------]
130s run-unit-test FAIL non-zero exit status 1
130s autopkgtest [16:50:41]: test run-unit-test: - - - - - - - - - - results - - - - - - - - - -
130s autopkgtest [16:50:41]: @@@@@@@@@@@@@@@@@@@@ summary
130s run-unit-test FAIL non-zero exit status 1
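Notes on the failure patterns recorded above. The four sketches below are editorial illustrations only, not part of the intake test suite or of the test run; directory, variable and function names they introduce (empty_dir, num_parts, extract_tar) are placeholders.

1) Most FAILED entries trace back to dask.dataframe.read_csv() being handed a glob (entry1_*.csv) that resolves to no files, because the expected CSV test fixtures are absent from the build tree. A minimal sketch, assuming dask[dataframe] and pandas are installed:

import os
import tempfile

import dask.dataframe as dd

with tempfile.TemporaryDirectory() as empty_dir:
    try:
        # No entry1_*.csv files exist here, mirroring the missing fixtures above
        dd.read_csv(os.path.join(empty_dir, "entry1_*.csv"))
    except OSError as exc:
        print(exc)  # message ends with "resolved to no files"

2) test_dir fails inside pytest itself: pytest.warns(None) is rejected with "exceptions must be derived from Warning". A hedged sketch of the stdlib way to assert that a piece of code emits no warnings, under the assumption that this is what the test intended:

import warnings

with warnings.catch_warnings(record=True) as captured:
    warnings.simplefilter("always")
    sorted([3, 1, 2])  # stand-in for the catalog operations exercised in the test
assert not captured    # no warnings were recorded

3) test_text_persist and test_text_export hit "math domain error" because the source being persisted appears to have zero parts, so fsspec's build_name_function receives num - 1 == -1 (the captured max_int is -0.99999999) and log10 of a negative number fails. A sketch of just that arithmetic:

import math

num_parts = 0                      # what the source appears to report
max_int = (num_parts - 1) + 1e-8   # mirrors the captured max_int of -0.99999999
try:
    int(math.ceil(math.log10(max_int)))
except ValueError as exc:
    print(exc)                     # "math domain error"

4) The DeprecationWarning from intake/source/decompress.py notes that Python 3.14 will filter extracted tar archives by default. A hedged sketch of passing an explicit filter to keep the behaviour deliberate (the wrapper name is illustrative; the filter argument exists in the standard library tarfile module on recent Python):

import tarfile

def extract_tar(archive: str, outpath: str) -> None:
    with tarfile.open(archive) as tar:
        # "data" is the stdlib-provided safe filter for ordinary data archives
        tar.extractall(outpath, filter="data")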