0s autopkgtest [23:49:54]: starting date and time: 2025-07-03 23:49:54+0000
0s autopkgtest [23:49:54]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [23:49:54]: host juju-7f2275-prod-proposed-migration-environment-9; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.mgazmpho/out --timeout-copy=6000 --setup-commands 'ln -s /dev/null /etc/systemd/system/bluetooth.service; printf "http_proxy=http://squid.internal:3128\nhttps_proxy=http://squid.internal:3128\nno_proxy=127.0.0.1,127.0.1.1,localhost,localdomain,internal,login.ubuntu.com,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com\n" >> /etc/environment' --apt-pocket=proposed=src:systemd,src:netplan.io,src:openssh,src:samba --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 '--env=ADT_TEST_TRIGGERS=systemd/255.4-1ubuntu8.10 netplan.io/1.1.2-2~ubuntu24.04.2 openssh/1:9.6p1-3ubuntu13.13 samba/2:4.19.5+dfsg-4ubuntu9.2' -- lxd -r lxd-armhf-10.145.243.229 lxd-armhf-10.145.243.229:autopkgtest/ubuntu/noble/armhf
26s autopkgtest [23:50:20]: testbed dpkg architecture: armhf
27s autopkgtest [23:50:21]: testbed apt version: 2.8.3
31s autopkgtest [23:50:25]: @@@@@@@@@@@@@@@@@@@@ test bed setup
33s autopkgtest [23:50:27]: testbed release detected to be: None
40s autopkgtest [23:50:34]: updating testbed package index (apt update)
42s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB]
42s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease
42s Get:3 http://ftpmaster.internal/ubuntu noble-updates InRelease [126 kB]
43s Get:4 http://ftpmaster.internal/ubuntu noble-security InRelease [126 kB]
43s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [64.5 kB]
43s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [28.9 kB]
43s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [63.8 kB]
43s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [3948 B]
43s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main armhf Packages [98.9 kB]
43s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main armhf c-n-f Metadata [2252 B]
43s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/restricted armhf Packages [2720 B]
43s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted armhf c-n-f Metadata [116 B]
43s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/universe armhf Packages [276 kB]
43s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/universe armhf c-n-f Metadata [2608 B]
43s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/multiverse armhf Packages [752 B]
43s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/multiverse armhf c-n-f Metadata [116 B]
43s Get:17 http://ftpmaster.internal/ubuntu noble-updates/multiverse Sources [16.0 kB]
43s Get:18 http://ftpmaster.internal/ubuntu noble-updates/main Sources [429 kB]
43s Get:19 http://ftpmaster.internal/ubuntu noble-updates/universe Sources [441 kB]
43s Get:20 http://ftpmaster.internal/ubuntu noble-updates/restricted Sources [44.7 kB]
43s Get:21 http://ftpmaster.internal/ubuntu noble-updates/main armhf Packages [606 kB]
43s Get:22 http://ftpmaster.internal/ubuntu noble-updates/universe armhf Packages [861 kB]
43s Get:23 http://ftpmaster.internal/ubuntu noble-updates/multiverse armhf Packages [2964 B]
43s Get:24 http://ftpmaster.internal/ubuntu noble-security/restricted Sources [41.5 kB]
43s Get:25 http://ftpmaster.internal/ubuntu noble-security/universe Sources [314 kB]
43s Get:26 http://ftpmaster.internal/ubuntu noble-security/main Sources [189 kB]
43s Get:27 http://ftpmaster.internal/ubuntu noble-security/multiverse Sources [10.2 kB]
43s Get:28 http://ftpmaster.internal/ubuntu noble-security/main armhf Packages [374 kB]
43s Get:29 http://ftpmaster.internal/ubuntu noble-security/universe armhf Packages [641 kB]
43s Get:30 http://ftpmaster.internal/ubuntu noble-security/multiverse armhf Packages [2228 B]
45s Fetched 5035 kB in 2s (3015 kB/s)
46s Reading package lists...
53s autopkgtest [23:50:47]: upgrading testbed (apt dist-upgrade and autopurge)
55s Reading package lists...
55s Building dependency tree...
55s Reading state information...
55s Calculating upgrade...Starting pkgProblemResolver with broken count: 0
55s Starting 2 pkgProblemResolver with broken count: 0
55s Done
56s Entering ResolveByKeep
56s 
57s The following packages were automatically installed and are no longer required:
57s   linux-headers-6.8.0-62 linux-headers-6.8.0-62-generic
57s Use 'apt autoremove' to remove them.
57s The following NEW packages will be installed:
57s   linux-headers-6.8.0-63 linux-headers-6.8.0-63-generic
57s The following packages will be upgraded:
57s   fwupd gzip libfwupd2 libnetplan1 libnss-systemd libpam-systemd
57s   libsystemd-shared libsystemd0 libudev1 linux-headers-generic
57s   netplan-generator netplan.io openssh-client openssh-server
57s   openssh-sftp-server python3-netplan sudo systemd systemd-dev
57s   systemd-resolved systemd-sysv systemd-timesyncd udev
57s 23 upgraded, 2 newly installed, 0 to remove and 0 not upgraded.
57s Need to get 31.5 MB of archives.
57s After this operation, 92.6 MB of additional disk space will be used.
57s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main armhf gzip armhf 1.12-1ubuntu3.1 [96.0 kB]
57s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libnss-systemd armhf 255.4-1ubuntu8.10 [148 kB]
58s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-dev all 255.4-1ubuntu8.10 [105 kB]
58s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-timesyncd armhf 255.4-1ubuntu8.10 [36.0 kB]
58s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-resolved armhf 255.4-1ubuntu8.10 [289 kB]
58s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libsystemd-shared armhf 255.4-1ubuntu8.10 [2013 kB]
58s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libsystemd0 armhf 255.4-1ubuntu8.10 [408 kB]
58s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-sysv armhf 255.4-1ubuntu8.10 [11.9 kB]
58s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libpam-systemd armhf 255.4-1ubuntu8.10 [216 kB]
58s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd armhf 255.4-1ubuntu8.10 [3506 kB]
59s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main armhf udev armhf 255.4-1ubuntu8.10 [1852 kB]
59s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libudev1 armhf 255.4-1ubuntu8.10 [168 kB]
60s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-sftp-server armhf 1:9.6p1-3ubuntu13.13 [35.5 kB]
60s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-server armhf 1:9.6p1-3ubuntu13.13 [505 kB]
60s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-client armhf 1:9.6p1-3ubuntu13.13 [891 kB]
60s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/main armhf python3-netplan armhf 1.1.2-2~ubuntu24.04.2 [24.1 kB]
60s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/main armhf netplan-generator armhf 1.1.2-2~ubuntu24.04.2 [60.7 kB]
60s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/main armhf netplan.io armhf 1.1.2-2~ubuntu24.04.2 [68.7 kB]
60s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libnetplan1 armhf 1.1.2-2~ubuntu24.04.2 [123 kB]
60s Get:20 http://ftpmaster.internal/ubuntu noble-updates/main armhf sudo armhf 1.9.15p5-3ubuntu5.24.04.1 [937 kB]
60s Get:21 http://ftpmaster.internal/ubuntu noble-updates/main armhf libfwupd2 armhf 1.9.30-0ubuntu1~24.04.1 [126 kB]
60s Get:22 http://ftpmaster.internal/ubuntu noble-updates/main armhf fwupd armhf 1.9.30-0ubuntu1~24.04.1 [4410 kB]
61s Get:23 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-6.8.0-63 all 6.8.0-63.66 [13.9 MB]
63s Get:24 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-6.8.0-63-generic armhf 6.8.0-63.66 [1570 kB]
64s Get:25 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-generic armhf 6.8.0-63.66 [10.5 kB]
64s Preconfiguring packages ...
64s Fetched 31.5 MB in 7s (4683 kB/s)
64s (Reading database ... 58042 files and directories currently installed.)
64s Preparing to unpack .../gzip_1.12-1ubuntu3.1_armhf.deb ...
64s Unpacking gzip (1.12-1ubuntu3.1) over (1.12-1ubuntu3) ...
64s Setting up gzip (1.12-1ubuntu3.1) ...
64s (Reading database ... 58042 files and directories currently installed.)
64s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.10_armhf.deb ...
64s Unpacking libnss-systemd:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.10_all.deb ...
65s Unpacking systemd-dev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking systemd-timesyncd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking systemd-resolved (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking libsystemd-shared:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking libsystemd0:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Setting up libsystemd0:armhf (255.4-1ubuntu8.10) ...
65s (Reading database ... 58042 files and directories currently installed.)
65s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking systemd-sysv (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking libpam-systemd:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../systemd_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking systemd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../udev_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking udev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Preparing to unpack .../libudev1_255.4-1ubuntu8.10_armhf.deb ...
65s Unpacking libudev1:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
65s Setting up libudev1:armhf (255.4-1ubuntu8.10) ...
65s (Reading database ... 58042 files and directories currently installed.)
65s Preparing to unpack .../00-openssh-sftp-server_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
65s Unpacking openssh-sftp-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
66s Preparing to unpack .../01-openssh-server_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
66s Unpacking openssh-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
66s Preparing to unpack .../02-openssh-client_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
66s Unpacking openssh-client (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
66s Preparing to unpack .../03-python3-netplan_1.1.2-2~ubuntu24.04.2_armhf.deb ...
66s Unpacking python3-netplan (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
66s Preparing to unpack .../04-netplan-generator_1.1.2-2~ubuntu24.04.2_armhf.deb ...
66s Adding 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
66s Unpacking netplan-generator (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
66s Preparing to unpack .../05-netplan.io_1.1.2-2~ubuntu24.04.2_armhf.deb ...
66s Unpacking netplan.io (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
66s Preparing to unpack .../06-libnetplan1_1.1.2-2~ubuntu24.04.2_armhf.deb ...
66s Unpacking libnetplan1:armhf (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
66s Preparing to unpack .../07-sudo_1.9.15p5-3ubuntu5.24.04.1_armhf.deb ...
66s Unpacking sudo (1.9.15p5-3ubuntu5.24.04.1) over (1.9.15p5-3ubuntu5) ...
66s Preparing to unpack .../08-libfwupd2_1.9.30-0ubuntu1~24.04.1_armhf.deb ...
66s Unpacking libfwupd2:armhf (1.9.30-0ubuntu1~24.04.1) over (1.9.29-0ubuntu1~24.04.1ubuntu1) ...
66s Preparing to unpack .../09-fwupd_1.9.30-0ubuntu1~24.04.1_armhf.deb ...
66s Unpacking fwupd (1.9.30-0ubuntu1~24.04.1) over (1.9.29-0ubuntu1~24.04.1ubuntu1) ...
67s Selecting previously unselected package linux-headers-6.8.0-63.
67s Preparing to unpack .../10-linux-headers-6.8.0-63_6.8.0-63.66_all.deb ...
67s Unpacking linux-headers-6.8.0-63 (6.8.0-63.66) ...
69s Selecting previously unselected package linux-headers-6.8.0-63-generic.
69s Preparing to unpack .../11-linux-headers-6.8.0-63-generic_6.8.0-63.66_armhf.deb ...
69s Unpacking linux-headers-6.8.0-63-generic (6.8.0-63.66) ...
71s Preparing to unpack .../12-linux-headers-generic_6.8.0-63.66_armhf.deb ...
71s Unpacking linux-headers-generic (6.8.0-63.66) over (6.8.0-62.65) ...
71s Setting up linux-headers-6.8.0-63 (6.8.0-63.66) ...
71s Setting up openssh-client (1:9.6p1-3ubuntu13.13) ...
71s Setting up libfwupd2:armhf (1.9.30-0ubuntu1~24.04.1) ...
71s Setting up systemd-dev (255.4-1ubuntu8.10) ...
71s Setting up libnetplan1:armhf (1.1.2-2~ubuntu24.04.2) ...
71s Setting up libsystemd-shared:armhf (255.4-1ubuntu8.10) ...
71s Setting up sudo (1.9.15p5-3ubuntu5.24.04.1) ...
71s Setting up linux-headers-6.8.0-63-generic (6.8.0-63.66) ...
71s Setting up python3-netplan (1.1.2-2~ubuntu24.04.2) ...
71s Setting up openssh-sftp-server (1:9.6p1-3ubuntu13.13) ...
71s Setting up openssh-server (1:9.6p1-3ubuntu13.13) ...
72s Setting up systemd (255.4-1ubuntu8.10) ...
72s Setting up linux-headers-generic (6.8.0-63.66) ...
72s Setting up systemd-timesyncd (255.4-1ubuntu8.10) ...
73s Setting up udev (255.4-1ubuntu8.10) ...
74s Setting up netplan-generator (1.1.2-2~ubuntu24.04.2) ...
74s Removing 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
74s Setting up fwupd (1.9.30-0ubuntu1~24.04.1) ...
74s fwupd-offline-update.service is a disabled or a static unit not running, not starting it.
74s fwupd-refresh.service is a disabled or a static unit not running, not starting it.
74s fwupd.service is a disabled or a static unit not running, not starting it.
74s Setting up systemd-resolved (255.4-1ubuntu8.10) ...
75s Setting up systemd-sysv (255.4-1ubuntu8.10) ...
75s Setting up libnss-systemd:armhf (255.4-1ubuntu8.10) ...
75s Setting up netplan.io (1.1.2-2~ubuntu24.04.2) ...
75s Setting up libpam-systemd:armhf (255.4-1ubuntu8.10) ...
75s Processing triggers for initramfs-tools (0.142ubuntu25.5) ...
75s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
75s Processing triggers for ufw (0.36.2-6) ...
75s Processing triggers for man-db (2.12.0-4build2) ...
76s Processing triggers for dbus (1.14.10-4ubuntu4.1) ...
76s Processing triggers for install-info (7.1-3build2) ...
79s Reading package lists...
79s Building dependency tree...
79s Reading state information...
79s Starting pkgProblemResolver with broken count: 0
79s Starting 2 pkgProblemResolver with broken count: 0
79s Done
80s The following packages will be REMOVED:
80s   linux-headers-6.8.0-62* linux-headers-6.8.0-62-generic*
80s 0 upgraded, 0 newly installed, 2 to remove and 0 not upgraded.
80s After this operation, 92.5 MB disk space will be freed.
80s (Reading database ... 89198 files and directories currently installed.)
80s Removing linux-headers-6.8.0-62-generic (6.8.0-62.65) ...
81s Removing linux-headers-6.8.0-62 (6.8.0-62.65) ...
84s autopkgtest [23:51:18]: rebooting testbed after setup commands that affected boot
123s autopkgtest [23:51:57]: testbed running kernel: Linux 6.8.0-58-generic #60~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Fri Mar 28 14:48:37 UTC 2
146s autopkgtest [23:52:20]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
167s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (dsc) [2134 B]
167s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (tar) [2470 kB]
167s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (diff) [18.2 kB]
167s gpgv: Signature made Mon Mar 3 15:59:36 2025 UTC
167s gpgv: using RSA key 38C77D33856973A58762FBFE401EFCBCDA0FF1BD
167s gpgv: Can't check signature: No public key
167s dpkg-source: warning: cannot verify inline signature for ./valkey_7.2.8+dfsg1-0ubuntu0.24.04.2.dsc: no acceptable signature found
167s autopkgtest [23:52:41]: testing package valkey version 7.2.8+dfsg1-0ubuntu0.24.04.2
170s autopkgtest [23:52:44]: build not needed
174s autopkgtest [23:52:48]: test 0001-valkey-cli: preparing testbed
176s Reading package lists...
176s Building dependency tree...
176s Reading state information...
176s Starting pkgProblemResolver with broken count: 0
176s Starting 2 pkgProblemResolver with broken count: 0
176s Done
177s The following NEW packages will be installed:
177s   libatomic1 libjemalloc2 liblzf1 valkey-server valkey-tools
177s 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
177s Need to get 1380 kB of archives.
177s After this operation, 5032 kB of additional disk space will be used.
177s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main armhf libatomic1 armhf 14.2.0-4ubuntu2~24.04 [7888 B]
177s Get:2 http://ftpmaster.internal/ubuntu noble/universe armhf libjemalloc2 armhf 5.3.0-2build1 [200 kB]
177s Get:3 http://ftpmaster.internal/ubuntu noble/universe armhf liblzf1 armhf 3.6-4 [6554 B]
177s Get:4 http://ftpmaster.internal/ubuntu noble-updates/universe armhf valkey-tools armhf 7.2.8+dfsg1-0ubuntu0.24.04.2 [1116 kB]
178s Get:5 http://ftpmaster.internal/ubuntu noble-updates/universe armhf valkey-server armhf 7.2.8+dfsg1-0ubuntu0.24.04.2 [49.3 kB]
178s Fetched 1380 kB in 1s (2047 kB/s)
178s Selecting previously unselected package libatomic1:armhf.
178s (Reading database ... 58042 files and directories currently installed.)
178s Preparing to unpack .../libatomic1_14.2.0-4ubuntu2~24.04_armhf.deb ...
178s Unpacking libatomic1:armhf (14.2.0-4ubuntu2~24.04) ...
178s Selecting previously unselected package libjemalloc2:armhf.
178s Preparing to unpack .../libjemalloc2_5.3.0-2build1_armhf.deb ...
178s Unpacking libjemalloc2:armhf (5.3.0-2build1) ...
178s Selecting previously unselected package liblzf1:armhf.
178s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ...
178s Unpacking liblzf1:armhf (3.6-4) ...
178s Selecting previously unselected package valkey-tools.
178s Preparing to unpack .../valkey-tools_7.2.8+dfsg1-0ubuntu0.24.04.2_armhf.deb ...
178s Unpacking valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
178s Selecting previously unselected package valkey-server.
178s Preparing to unpack .../valkey-server_7.2.8+dfsg1-0ubuntu0.24.04.2_armhf.deb ...
178s Unpacking valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
178s Setting up libjemalloc2:armhf (5.3.0-2build1) ...
178s Setting up liblzf1:armhf (3.6-4) ...
178s Setting up libatomic1:armhf (14.2.0-4ubuntu2~24.04) ...
178s Setting up valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
178s Setting up valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
179s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service.
179s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service.
179s Processing triggers for man-db (2.12.0-4build2) ...
180s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
187s autopkgtest [23:53:01]: test 0001-valkey-cli: [-----------------------
194s # Server
194s redis_version:7.2.4
194s server_name:valkey
194s valkey_version:7.2.8
194s redis_git_sha1:00000000
194s redis_git_dirty:0
194s redis_build_id:c153bd6b3f23fc46
194s redis_mode:standalone
194s os:Linux 6.8.0-58-generic armv7l
194s arch_bits:32
194s monotonic_clock:POSIX clock_gettime
194s multiplexing_api:epoll
194s atomicvar_api:c11-builtin
194s gcc_version:13.3.0
194s process_id:1036
194s process_supervised:systemd
194s run_id:be164a39e8314ffc6986b582ffffb5c0e04a7ef3
194s tcp_port:6379
194s server_time_usec:1751586788629688
194s uptime_in_seconds:5
194s uptime_in_days:0
194s hz:10
194s configured_hz:10
194s lru_clock:6756324
194s executable:/usr/bin/valkey-server
194s config_file:/etc/valkey/valkey.conf
194s io_threads_active:0
194s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
194s 
194s # Clients
194s connected_clients:1
194s cluster_connections:0
194s maxclients:10000
194s client_recent_max_input_buffer:0
194s client_recent_max_output_buffer:0
194s blocked_clients:0
194s tracking_clients:0
194s clients_in_timeout_table:0
194s total_blocking_keys:0
194s total_blocking_keys_on_nokey:0
194s 
194s # Memory
194s used_memory:761872
194s used_memory_human:744.02K
194s used_memory_rss:9437184
194s used_memory_rss_human:9.00M
194s used_memory_peak:761872
194s used_memory_peak_human:744.02K
194s used_memory_peak_perc:103.12%
194s used_memory_overhead:721568
194s used_memory_startup:721448
194s used_memory_dataset:40304
194s used_memory_dataset_perc:99.70%
194s allocator_allocated:3823200
194s allocator_active:9371648
194s allocator_resident:10158080
194s total_system_memory:3844046848
194s total_system_memory_human:3.58G
194s used_memory_lua:22528
194s used_memory_vm_eval:22528
194s used_memory_lua_human:22.00K
194s used_memory_scripts_eval:0
194s number_of_cached_scripts:0
194s number_of_functions:0
194s number_of_libraries:0
194s used_memory_vm_functions:24576
194s used_memory_vm_total:47104
194s used_memory_vm_total_human:46.00K
194s used_memory_functions:120
194s used_memory_scripts:120
194s used_memory_scripts_human:120B
194s maxmemory:3221225472
194s maxmemory_human:3.00G
194s maxmemory_policy:noeviction
194s allocator_frag_ratio:2.45
194s allocator_frag_bytes:5548448
194s allocator_rss_ratio:1.08
194s allocator_rss_bytes:786432
194s rss_overhead_ratio:0.93
194s rss_overhead_bytes:-720896
194s mem_fragmentation_ratio:13.08
194s mem_fragmentation_bytes:8715648
194s mem_not_counted_for_evict:0
194s mem_replication_backlog:0
194s mem_total_replication_buffers:0
194s mem_clients_slaves:0
194s mem_clients_normal:0
194s mem_cluster_links:0
194s mem_aof_buffer:0
194s mem_allocator:jemalloc-5.3.0
194s active_defrag_running:0
194s lazyfree_pending_objects:0
194s lazyfreed_objects:0
194s 
194s # Persistence
194s loading:0
194s async_loading:0
194s current_cow_peak:0
194s current_cow_size:0
194s current_cow_size_age:0
194s current_fork_perc:0.00
194s current_save_keys_processed:0
194s current_save_keys_total:0
194s rdb_changes_since_last_save:0
194s rdb_bgsave_in_progress:0
194s rdb_last_save_time:1751586783
194s rdb_last_bgsave_status:ok
194s rdb_last_bgsave_time_sec:-1
194s rdb_current_bgsave_time_sec:-1
194s rdb_saves:0
194s rdb_last_cow_size:0
194s rdb_last_load_keys_expired:0
194s rdb_last_load_keys_loaded:0
194s aof_enabled:0
194s aof_rewrite_in_progress:0
194s aof_rewrite_scheduled:0
194s aof_last_rewrite_time_sec:-1
194s aof_current_rewrite_time_sec:-1
194s aof_last_bgrewrite_status:ok
194s aof_rewrites:0
194s aof_rewrites_consecutive_failures:0
194s aof_last_write_status:ok
194s aof_last_cow_size:0
194s module_fork_in_progress:0
194s module_fork_last_cow_size:0
194s 
194s # Stats
194s total_connections_received:1
194s total_commands_processed:0
194s instantaneous_ops_per_sec:0
194s total_net_input_bytes:14
194s total_net_output_bytes:0
194s total_net_repl_input_bytes:0
194s total_net_repl_output_bytes:0
194s instantaneous_input_kbps:0.00
194s instantaneous_output_kbps:0.00
194s instantaneous_input_repl_kbps:0.00
194s instantaneous_output_repl_kbps:0.00
194s rejected_connections:0
194s sync_full:0
194s sync_partial_ok:0
194s sync_partial_err:0
194s expired_keys:0
194s expired_stale_perc:0.00
194s expired_time_cap_reached_count:0
194s expire_cycle_cpu_milliseconds:0
194s evicted_keys:0
194s evicted_clients:0
194s total_eviction_exceeded_time:0
194s current_eviction_exceeded_time:0
194s keyspace_hits:0
194s keyspace_misses:0
194s pubsub_channels:0
194s pubsub_patterns:0
194s pubsubshard_channels:0
194s latest_fork_usec:0
194s total_forks:0
194s migrate_cached_sockets:0
194s slave_expires_tracked_keys:0
194s active_defrag_hits:0
194s active_defrag_misses:0
194s active_defrag_key_hits:0
194s active_defrag_key_misses:0
194s total_active_defrag_time:0
194s current_active_defrag_time:0
194s tracking_total_keys:0
194s tracking_total_items:0
194s tracking_total_prefixes:0
194s unexpected_error_replies:0
194s total_error_replies:0
194s dump_payload_sanitizations:0
194s total_reads_processed:1
194s total_writes_processed:0
194s io_threaded_reads_processed:0
194s io_threaded_writes_processed:0
194s reply_buffer_shrinks:0
194s reply_buffer_expands:0
194s eventloop_cycles:51
194s eventloop_duration_sum:9197
194s eventloop_duration_cmd_sum:0
194s instantaneous_eventloop_cycles_per_sec:9
194s instantaneous_eventloop_duration_usec:171
194s acl_access_denied_auth:0
194s acl_access_denied_cmd:0
194s acl_access_denied_key:0
194s acl_access_denied_channel:0
194s 
194s # Replication
194s role:master
194s connected_slaves:0
194s master_failover_state:no-failover
194s master_replid:7240a07c8e9917e9fd18947d2386cfa4ac4d9430
194s master_replid2:0000000000000000000000000000000000000000
194s master_repl_offset:0
194s second_repl_offset:-1
194s repl_backlog_active:0
194s repl_backlog_size:1048576
194s repl_backlog_first_byte_offset:0
194s repl_backlog_histlen:0
194s 
194s # CPU
194s used_cpu_sys:0.045968
194s used_cpu_user:0.054140
194s used_cpu_sys_children:0.000691
194s used_cpu_user_children:0.000000
194s used_cpu_sys_main_thread:0.044965
194s used_cpu_user_main_thread:0.054163
194s 
194s # Modules
194s 
194s # Errorstats
194s 
194s # Cluster
194s cluster_enabled:0
194s 
194s # Keyspace
194s Redis ver. 7.2.8
195s autopkgtest [23:53:09]: test 0001-valkey-cli: -----------------------]
198s 0001-valkey-cli PASS
198s autopkgtest [23:53:12]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
202s autopkgtest [23:53:16]: test 0002-benchmark: preparing testbed
204s Reading package lists...
204s Building dependency tree...
204s Reading state information...
204s Starting pkgProblemResolver with broken count: 0
204s Starting 2 pkgProblemResolver with broken count: 0
204s Done
205s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
212s autopkgtest [23:53:26]: test 0002-benchmark: [-----------------------
220s PING_INLINE: rps=0.0 (overall: nan) avg_msec=nan (overall: nan)
220s ====== PING_INLINE ======
220s   100000 requests completed in 0.22 seconds
220s   50 parallel clients
220s   3 bytes payload
220s   keep alive: 1
220s   host configuration "save": 3600 1 300 100 60 10000
220s   host configuration "appendonly": no
220s   multi-thread: no
220s 
220s Latency by percentile distribution:
220s 0.000% <= 0.319 milliseconds (cumulative count 10)
220s 50.000% <= 0.967 milliseconds (cumulative count 51660)
220s 75.000% <= 1.087 milliseconds (cumulative count 75350)
220s 87.500% <= 1.175 milliseconds (cumulative count 88270)
220s 93.750% <= 1.231 milliseconds (cumulative count 94140)
220s 96.875% <= 1.287 milliseconds (cumulative count 96980)
220s 98.438% <= 1.359 milliseconds (cumulative count 98520)
220s 99.219% <= 1.439 milliseconds (cumulative count 99230)
220s 99.609% <= 1.639 milliseconds (cumulative count 99610)
220s 99.805% <= 1.831 milliseconds (cumulative count 99810)
220s 99.902% <= 2.359 milliseconds (cumulative count 99910)
220s 99.951% <= 2.471 milliseconds (cumulative count 99960)
220s 99.976% <= 2.519 milliseconds (cumulative count 99980)
220s 99.988% <= 2.535 milliseconds (cumulative count 99990)
220s 99.994% <= 2.615 milliseconds (cumulative count 100000)
220s 100.000% <= 2.615 milliseconds (cumulative count 100000)
220s 
220s Cumulative distribution of latencies:
220s 0.000% <= 0.103 milliseconds (cumulative count 0)
220s 0.300% <= 0.407 milliseconds (cumulative count 300)
220s 0.660% <= 0.503 milliseconds (cumulative count 660)
220s 1.430% <= 0.607 milliseconds (cumulative count 1430)
220s 5.500% <= 0.703 milliseconds (cumulative count 5500)
220s 20.410% <= 0.807 milliseconds (cumulative count 20410)
220s 38.270% <= 0.903 milliseconds (cumulative count 38270)
220s 60.530% <= 1.007 milliseconds (cumulative count 60530)
220s 77.900% <= 1.103 milliseconds (cumulative count 77900)
220s 91.970% <= 1.207 milliseconds (cumulative count 91970)
220s 97.460% <= 1.303 milliseconds (cumulative count 97460)
220s 99.010% <= 1.407 milliseconds (cumulative count 99010)
220s 99.450% <= 1.503 milliseconds (cumulative count 99450)
220s 99.590% <= 1.607 milliseconds (cumulative count 99590)
220s 99.670% <= 1.703 milliseconds (cumulative count 99670)
220s 99.790% <= 1.807 milliseconds (cumulative count 99790)
220s 99.820% <= 1.903 milliseconds (cumulative count 99820)
220s 99.860% <= 2.007 milliseconds (cumulative count 99860)
220s 99.890% <= 2.103 milliseconds (cumulative count 99890)
220s 100.000% <= 3.103 milliseconds (cumulative count 100000)
220s 
220s Summary:
220s   throughput summary: 465116.28 requests per second
220s   latency summary (msec):
220s           avg       min       p50       p95       p99       max
220s         0.965     0.312     0.967     1.247     1.407     2.615
220s PING_MBULK: rps=69322.7 (overall: 511764.7) avg_msec=0.843 (overall: 0.843)
220s ====== PING_MBULK ======
220s   100000 requests completed in 0.19 seconds
220s   50 parallel clients
220s   3 bytes payload
220s   keep alive: 1
220s   host configuration "save": 3600 1 300 100 60 10000
220s   host configuration "appendonly": no
220s   multi-thread: no
220s 
220s Latency by percentile distribution:
220s 0.000% <= 0.271 milliseconds (cumulative count 10)
220s 50.000% <= 0.831 milliseconds (cumulative count 51980)
220s 75.000% <= 0.951 milliseconds (cumulative count 76190)
220s 87.500% <= 1.031 milliseconds (cumulative count 88060)
220s 93.750% <= 1.079 milliseconds (cumulative count 94300)
220s 96.875% <= 1.119 milliseconds (cumulative count 97270)
220s 98.438% <= 1.151 milliseconds (cumulative count 98540)
220s 99.219% <= 1.191 milliseconds (cumulative count 99280)
220s 99.609% <= 1.223 milliseconds (cumulative count 99620)
220s 99.805% <= 1.271 milliseconds (cumulative count 99820)
220s 99.902% <= 1.295 milliseconds (cumulative count 99910)
220s 99.951% <= 1.327 milliseconds (cumulative count 99960)
220s 99.976% <= 1.343 milliseconds (cumulative count 99980)
220s 99.988% <= 1.367 milliseconds (cumulative count 99990)
220s 99.994% <= 1.391 milliseconds (cumulative count 100000)
220s 100.000% <= 1.391 milliseconds (cumulative count 100000)
220s 
220s Cumulative distribution of latencies:
220s 0.000% <= 0.103 milliseconds (cumulative count 0)
220s 0.030% <= 0.303 milliseconds (cumulative count 30)
220s 0.270% <= 0.407 milliseconds (cumulative count 270)
220s 0.640% <= 0.503 milliseconds (cumulative count 640)
220s 3.200% <= 0.607 milliseconds (cumulative count 3200)
220s 20.710% <= 0.703 milliseconds (cumulative count 20710)
220s 46.220% <= 0.807 milliseconds (cumulative count 46220)
220s 67.900% <= 0.903 milliseconds (cumulative count 67900)
220s 84.630% <= 1.007 milliseconds (cumulative count 84630)
220s 96.280% <= 1.103 milliseconds (cumulative count 96280)
220s 99.480% <= 1.207 milliseconds (cumulative count 99480)
220s 99.920% <= 1.303 milliseconds (cumulative count 99920)
220s 100.000% <= 1.407 milliseconds (cumulative count 100000)
220s 
220s Summary:
220s   throughput summary: 529100.56 requests per second
220s   latency summary (msec):
220s           avg       min       p50       p95       p99       max
220s         0.837     0.264     0.831     1.087     1.175     1.391
220s SET: rps=156240.0 (overall: 420000.0) avg_msec=1.077 (overall: 1.077)
220s ====== SET ======
220s   100000 requests completed in 0.24 seconds
220s   50 parallel clients
220s   3 bytes payload
220s   keep alive: 1
220s   host configuration "save": 3600 1 300 100 60 10000
220s   host configuration "appendonly": no
220s   multi-thread: no
220s 
220s Latency by percentile distribution:
220s 0.000% <= 0.343 milliseconds (cumulative count 10)
220s 50.000% <= 1.103 milliseconds (cumulative count 51550)
220s 75.000% <= 1.231 milliseconds (cumulative count 76260)
220s 87.500% <= 1.311 milliseconds (cumulative count 87770)
220s 93.750% <= 1.367 milliseconds (cumulative count 94050)
220s 96.875% <= 1.415 milliseconds (cumulative count 97070)
220s 98.438% <= 1.463 milliseconds (cumulative count 98590)
220s 99.219% <= 1.511 milliseconds (cumulative count 99290)
220s 99.609% <= 1.559 milliseconds (cumulative count 99660)
220s 99.805% <= 1.607 milliseconds (cumulative count 99820)
220s 99.902% <= 1.663 milliseconds (cumulative count 99920)
220s 99.951% <= 1.703 milliseconds (cumulative count 99960)
220s 99.976% <= 1.743 milliseconds (cumulative count 99980)
220s 99.988% <= 1.751 milliseconds (cumulative count 99990)
220s 99.994% <= 1.887 milliseconds (cumulative count 100000)
220s 100.000% <= 1.887 milliseconds (cumulative count 100000)
220s 
220s Cumulative distribution of latencies:
220s 0.000% <= 0.103 milliseconds (cumulative count 0)
220s 0.050% <= 0.407 milliseconds (cumulative count 50)
220s 0.120% <= 0.503 milliseconds (cumulative count 120)
220s 0.130% <= 0.607 milliseconds (cumulative count 130)
220s 0.310% <= 0.703 milliseconds (cumulative count 310)
220s 6.340% <= 0.807 milliseconds (cumulative count 6340)
220s 17.230% <= 0.903 milliseconds (cumulative count 17230)
220s 31.680% <= 1.007 milliseconds (cumulative count 31680)
220s 51.550% <= 1.103 milliseconds (cumulative count 51550)
220s 72.310% <= 1.207 milliseconds (cumulative count 72310)
220s 86.790% <= 1.303 milliseconds (cumulative count 86790)
220s 96.770% <= 1.407 milliseconds (cumulative count 96770)
220s 99.190% <= 1.503 milliseconds (cumulative count 99190)
220s 99.820% <= 1.607 milliseconds (cumulative count 99820)
220s 99.960% <= 1.703 milliseconds (cumulative count 99960)
220s 99.990% <= 1.807 milliseconds (cumulative count 99990)
220s 100.000% <= 1.903 milliseconds (cumulative count 100000)
220s 
220s Summary:
220s   throughput summary: 414937.75 requests per second
220s   latency summary (msec):
220s           avg       min       p50       p95       p99       max
220s         1.094     0.336     1.103     1.383     1.487     1.887
220s GET: rps=180280.0 (overall: 446237.6) avg_msec=1.005 (overall: 1.005)
220s ====== GET ======
220s   100000 requests completed in 0.22 seconds
220s   50 parallel clients
220s   3 bytes payload
220s   keep alive: 1
220s   host configuration "save": 3600 1 300 100 60 10000
220s   host configuration "appendonly":
no 220s multi-thread: no 220s 220s Latency by percentile distribution: 220s 0.000% <= 0.359 milliseconds (cumulative count 10) 220s 50.000% <= 0.991 milliseconds (cumulative count 50960) 220s 75.000% <= 1.111 milliseconds (cumulative count 75210) 220s 87.500% <= 1.199 milliseconds (cumulative count 88390) 220s 93.750% <= 1.247 milliseconds (cumulative count 94020) 220s 96.875% <= 1.287 milliseconds (cumulative count 96960) 220s 98.438% <= 1.327 milliseconds (cumulative count 98500) 220s 99.219% <= 1.367 milliseconds (cumulative count 99230) 220s 99.609% <= 1.399 milliseconds (cumulative count 99610) 220s 99.805% <= 1.447 milliseconds (cumulative count 99810) 220s 99.902% <= 1.471 milliseconds (cumulative count 99910) 220s 99.951% <= 1.519 milliseconds (cumulative count 99970) 220s 99.976% <= 1.527 milliseconds (cumulative count 99980) 220s 99.988% <= 1.551 milliseconds (cumulative count 99990) 220s 99.994% <= 1.583 milliseconds (cumulative count 100000) 220s 100.000% <= 1.583 milliseconds (cumulative count 100000) 220s 220s Cumulative distribution of latencies: 220s 0.000% <= 0.103 milliseconds (cumulative count 0) 220s 0.050% <= 0.407 milliseconds (cumulative count 50) 220s 0.120% <= 0.503 milliseconds (cumulative count 120) 220s 0.220% <= 0.607 milliseconds (cumulative count 220) 220s 2.230% <= 0.703 milliseconds (cumulative count 2230) 220s 15.530% <= 0.807 milliseconds (cumulative count 15530) 220s 32.420% <= 0.903 milliseconds (cumulative count 32420) 220s 54.640% <= 1.007 milliseconds (cumulative count 54640) 220s 74.020% <= 1.103 milliseconds (cumulative count 74020) 220s 89.370% <= 1.207 milliseconds (cumulative count 89370) 220s 97.720% <= 1.303 milliseconds (cumulative count 97720) 220s 99.660% <= 1.407 milliseconds (cumulative count 99660) 220s 99.940% <= 1.503 milliseconds (cumulative count 99940) 220s 100.000% <= 1.607 milliseconds (cumulative count 100000) 220s 220s Summary: 220s throughput summary: 452488.69 requests per second 220s latency summary 
(msec): 220s avg min p50 p95 p99 max 220s 0.990 0.352 0.991 1.263 1.359 1.583 221s INCR: rps=206414.4 (overall: 401627.9) avg_msec=1.124 (overall: 1.124) ====== INCR ====== 221s 100000 requests completed in 0.24 seconds 221s 50 parallel clients 221s 3 bytes payload 221s keep alive: 1 221s host configuration "save": 3600 1 300 100 60 10000 221s host configuration "appendonly": no 221s multi-thread: no 221s 221s Latency by percentile distribution: 221s 0.000% <= 0.319 milliseconds (cumulative count 10) 221s 50.000% <= 1.047 milliseconds (cumulative count 51530) 221s 75.000% <= 1.191 milliseconds (cumulative count 75920) 221s 87.500% <= 1.287 milliseconds (cumulative count 87740) 221s 93.750% <= 1.351 milliseconds (cumulative count 93860) 221s 96.875% <= 1.423 milliseconds (cumulative count 97080) 221s 98.438% <= 1.503 milliseconds (cumulative count 98520) 221s 99.219% <= 1.919 milliseconds (cumulative count 99220) 221s 99.609% <= 9.215 milliseconds (cumulative count 99610) 221s 99.805% <= 9.647 milliseconds (cumulative count 99810) 221s 99.902% <= 10.199 milliseconds (cumulative count 99910) 221s 99.951% <= 10.335 milliseconds (cumulative count 99960) 221s 99.976% <= 10.455 milliseconds (cumulative count 99980) 221s 99.988% <= 10.503 milliseconds (cumulative count 99990) 221s 99.994% <= 10.551 milliseconds (cumulative count 100000) 221s 100.000% <= 10.551 milliseconds (cumulative count 100000) 221s 221s Cumulative distribution of latencies: 221s 0.000% <= 0.103 milliseconds (cumulative count 0) 221s 0.070% <= 0.407 milliseconds (cumulative count 70) 221s 0.220% <= 0.503 milliseconds (cumulative count 220) 221s 0.590% <= 0.607 milliseconds (cumulative count 590) 221s 1.340% <= 0.703 milliseconds (cumulative count 1340) 221s 9.160% <= 0.807 milliseconds (cumulative count 9160) 221s 25.280% <= 0.903 milliseconds (cumulative count 25280) 221s 43.750% <= 1.007 milliseconds (cumulative count 43750) 221s 62.090% <= 1.103 milliseconds (cumulative count 62090) 221s 78.140% <= 
1.207 milliseconds (cumulative count 78140) 221s 89.540% <= 1.303 milliseconds (cumulative count 89540) 221s 96.610% <= 1.407 milliseconds (cumulative count 96610) 221s 98.520% <= 1.503 milliseconds (cumulative count 98520) 221s 99.010% <= 1.607 milliseconds (cumulative count 99010) 221s 99.120% <= 1.703 milliseconds (cumulative count 99120) 221s 99.140% <= 1.807 milliseconds (cumulative count 99140) 221s 99.210% <= 1.903 milliseconds (cumulative count 99210) 221s 99.280% <= 2.007 milliseconds (cumulative count 99280) 221s 99.320% <= 2.103 milliseconds (cumulative count 99320) 221s 99.500% <= 3.103 milliseconds (cumulative count 99500) 221s 99.510% <= 9.103 milliseconds (cumulative count 99510) 221s 99.850% <= 10.103 milliseconds (cumulative count 99850) 221s 100.000% <= 11.103 milliseconds (cumulative count 100000) 221s 221s Summary: 221s throughput summary: 413223.16 requests per second 221s latency summary (msec): 221s avg min p50 p95 p99 max 221s 1.093 0.312 1.047 1.375 1.599 10.551 221s LPUSH: rps=184422.3 (overall: 342888.9) avg_msec=1.332 (overall: 1.332) ====== LPUSH ====== 221s 100000 requests completed in 0.29 seconds 221s 50 parallel clients 221s 3 bytes payload 221s keep alive: 1 221s host configuration "save": 3600 1 300 100 60 10000 221s host configuration "appendonly": no 221s multi-thread: no 221s 221s Latency by percentile distribution: 221s 0.000% <= 0.479 milliseconds (cumulative count 10) 221s 50.000% <= 1.343 milliseconds (cumulative count 51190) 221s 75.000% <= 1.495 milliseconds (cumulative count 76060) 221s 87.500% <= 1.591 milliseconds (cumulative count 87580) 221s 93.750% <= 1.671 milliseconds (cumulative count 93960) 221s 96.875% <= 1.735 milliseconds (cumulative count 96920) 221s 98.438% <= 1.799 milliseconds (cumulative count 98490) 221s 99.219% <= 1.871 milliseconds (cumulative count 99280) 221s 99.609% <= 1.911 milliseconds (cumulative count 99620) 221s 99.805% <= 1.967 milliseconds (cumulative count 99820) 221s 99.902% <= 1.999 
milliseconds (cumulative count 99920) 221s 99.951% <= 2.039 milliseconds (cumulative count 99960) 221s 99.976% <= 2.055 milliseconds (cumulative count 99980) 221s 99.988% <= 2.071 milliseconds (cumulative count 99990) 221s 99.994% <= 2.079 milliseconds (cumulative count 100000) 221s 100.000% <= 2.079 milliseconds (cumulative count 100000) 221s 221s Cumulative distribution of latencies: 221s 0.000% <= 0.103 milliseconds (cumulative count 0) 221s 0.020% <= 0.503 milliseconds (cumulative count 20) 221s 0.150% <= 0.607 milliseconds (cumulative count 150) 221s 0.290% <= 0.703 milliseconds (cumulative count 290) 221s 1.020% <= 0.807 milliseconds (cumulative count 1020) 221s 5.150% <= 0.903 milliseconds (cumulative count 5150) 221s 12.070% <= 1.007 milliseconds (cumulative count 12070) 221s 18.630% <= 1.103 milliseconds (cumulative count 18630) 221s 29.520% <= 1.207 milliseconds (cumulative count 29520) 221s 44.620% <= 1.303 milliseconds (cumulative count 44620) 221s 62.150% <= 1.407 milliseconds (cumulative count 62150) 221s 77.000% <= 1.503 milliseconds (cumulative count 77000) 221s 89.080% <= 1.607 milliseconds (cumulative count 89080) 221s 95.630% <= 1.703 milliseconds (cumulative count 95630) 221s 98.580% <= 1.807 milliseconds (cumulative count 98580) 221s 99.520% <= 1.903 milliseconds (cumulative count 99520) 221s 99.920% <= 2.007 milliseconds (cumulative count 99920) 221s 100.000% <= 2.103 milliseconds (cumulative count 100000) 221s 221s Summary: 221s throughput summary: 348432.06 requests per second 221s latency summary (msec): 221s avg min p50 p95 p99 max 221s 1.322 0.472 1.343 1.695 1.847 2.079 221s RPUSH: rps=155440.0 (overall: 404791.7) avg_msec=1.110 (overall: 1.110) ====== RPUSH ====== 221s 100000 requests completed in 0.25 seconds 221s 50 parallel clients 221s 3 bytes payload 221s keep alive: 1 221s host configuration "save": 3600 1 300 100 60 10000 221s host configuration "appendonly": no 221s multi-thread: no 221s 221s Latency by percentile distribution: 
221s 0.000% <= 0.327 milliseconds (cumulative count 10) 221s 50.000% <= 1.127 milliseconds (cumulative count 51330) 221s 75.000% <= 1.263 milliseconds (cumulative count 75060) 221s 87.500% <= 1.359 milliseconds (cumulative count 87560) 221s 93.750% <= 1.423 milliseconds (cumulative count 94200) 221s 96.875% <= 1.471 milliseconds (cumulative count 96960) 221s 98.438% <= 1.519 milliseconds (cumulative count 98440) 221s 99.219% <= 1.567 milliseconds (cumulative count 99250) 221s 99.609% <= 1.631 milliseconds (cumulative count 99640) 221s 99.805% <= 1.695 milliseconds (cumulative count 99810) 221s 99.902% <= 1.743 milliseconds (cumulative count 99910) 221s 99.951% <= 1.807 milliseconds (cumulative count 99960) 221s 99.976% <= 1.815 milliseconds (cumulative count 99980) 221s 99.988% <= 1.847 milliseconds (cumulative count 99990) 221s 99.994% <= 1.863 milliseconds (cumulative count 100000) 221s 100.000% <= 1.863 milliseconds (cumulative count 100000) 221s 221s Cumulative distribution of latencies: 221s 0.000% <= 0.103 milliseconds (cumulative count 0) 221s 0.070% <= 0.407 milliseconds (cumulative count 70) 221s 0.160% <= 0.503 milliseconds (cumulative count 160) 221s 0.410% <= 0.607 milliseconds (cumulative count 410) 221s 0.970% <= 0.703 milliseconds (cumulative count 970) 221s 4.390% <= 0.807 milliseconds (cumulative count 4390) 221s 15.780% <= 0.903 milliseconds (cumulative count 15780) 221s 30.040% <= 1.007 milliseconds (cumulative count 30040) 221s 46.670% <= 1.103 milliseconds (cumulative count 46670) 221s 66.220% <= 1.207 milliseconds (cumulative count 66220) 221s 80.570% <= 1.303 milliseconds (cumulative count 80570) 221s 92.830% <= 1.407 milliseconds (cumulative count 92830) 221s 98.050% <= 1.503 milliseconds (cumulative count 98050) 221s 99.560% <= 1.607 milliseconds (cumulative count 99560) 221s 99.830% <= 1.703 milliseconds (cumulative count 99830) 221s 99.960% <= 1.807 milliseconds (cumulative count 99960) 221s 100.000% <= 1.903 milliseconds (cumulative 
count 100000) 221s 221s Summary: 221s throughput summary: 404858.31 requests per second 221s latency summary (msec): 221s avg min p50 p95 p99 max 221s 1.120 0.320 1.127 1.439 1.551 1.863 221s LPOP: rps=136320.0 (overall: 351340.2) avg_msec=1.303 (overall: 1.303) ====== LPOP ====== 221s 100000 requests completed in 0.29 seconds 221s 50 parallel clients 221s 3 bytes payload 221s keep alive: 1 221s host configuration "save": 3600 1 300 100 60 10000 221s host configuration "appendonly": no 221s multi-thread: no 221s 221s Latency by percentile distribution: 221s 0.000% <= 0.335 milliseconds (cumulative count 10) 221s 50.000% <= 1.327 milliseconds (cumulative count 50210) 221s 75.000% <= 1.479 milliseconds (cumulative count 75400) 221s 87.500% <= 1.583 milliseconds (cumulative count 88320) 221s 93.750% <= 1.647 milliseconds (cumulative count 94090) 221s 96.875% <= 1.703 milliseconds (cumulative count 97110) 221s 98.438% <= 1.751 milliseconds (cumulative count 98560) 221s 99.219% <= 1.791 milliseconds (cumulative count 99250) 221s 99.609% <= 1.839 milliseconds (cumulative count 99640) 221s 99.805% <= 1.871 milliseconds (cumulative count 99820) 221s 99.902% <= 1.903 milliseconds (cumulative count 99910) 221s 99.951% <= 1.919 milliseconds (cumulative count 99960) 221s 99.976% <= 1.951 milliseconds (cumulative count 99980) 221s 99.988% <= 1.967 milliseconds (cumulative count 99990) 221s 99.994% <= 1.991 milliseconds (cumulative count 100000) 221s 100.000% <= 1.991 milliseconds (cumulative count 100000) 221s 221s Cumulative distribution of latencies: 221s 0.000% <= 0.103 milliseconds (cumulative count 0) 221s 0.060% <= 0.407 milliseconds (cumulative count 60) 221s 0.130% <= 0.503 milliseconds (cumulative count 130) 221s 0.180% <= 0.607 milliseconds (cumulative count 180) 221s 0.210% <= 0.703 milliseconds (cumulative count 210) 221s 0.850% <= 0.807 milliseconds (cumulative count 850) 221s 4.350% <= 0.903 milliseconds (cumulative count 4350) 221s 12.160% <= 1.007 milliseconds 
(cumulative count 12160) 221s 20.780% <= 1.103 milliseconds (cumulative count 20780) 221s 31.190% <= 1.207 milliseconds (cumulative count 31190) 221s 45.890% <= 1.303 milliseconds (cumulative count 45890) 221s 63.780% <= 1.407 milliseconds (cumulative count 63780) 221s 78.760% <= 1.503 milliseconds (cumulative count 78760) 221s 90.650% <= 1.607 milliseconds (cumulative count 90650) 221s 97.110% <= 1.703 milliseconds (cumulative count 97110) 221s 99.430% <= 1.807 milliseconds (cumulative count 99430) 221s 99.910% <= 1.903 milliseconds (cumulative count 99910) 221s 100.000% <= 2.007 milliseconds (cumulative count 100000) 221s 221s Summary: 221s throughput summary: 349650.34 requests per second 221s latency summary (msec): 221s avg min p50 p95 p99 max 221s 1.311 0.328 1.327 1.663 1.775 1.991 222s RPOP: rps=88207.2 (overall: 369000.0) avg_msec=1.228 (overall: 1.228) ====== RPOP ====== 222s 100000 requests completed in 0.27 seconds 222s 50 parallel clients 222s 3 bytes payload 222s keep alive: 1 222s host configuration "save": 3600 1 300 100 60 10000 222s host configuration "appendonly": no 222s multi-thread: no 222s 222s Latency by percentile distribution: 222s 0.000% <= 0.455 milliseconds (cumulative count 10) 222s 50.000% <= 1.231 milliseconds (cumulative count 51150) 222s 75.000% <= 1.375 milliseconds (cumulative count 75210) 222s 87.500% <= 1.471 milliseconds (cumulative count 87700) 222s 93.750% <= 1.535 milliseconds (cumulative count 94020) 222s 96.875% <= 1.591 milliseconds (cumulative count 96940) 222s 98.438% <= 1.671 milliseconds (cumulative count 98470) 222s 99.219% <= 1.807 milliseconds (cumulative count 99220) 222s 99.609% <= 2.271 milliseconds (cumulative count 99610) 222s 99.805% <= 9.191 milliseconds (cumulative count 99830) 222s 99.902% <= 9.327 milliseconds (cumulative count 99920) 222s 99.951% <= 9.727 milliseconds (cumulative count 99960) 222s 99.976% <= 9.839 milliseconds (cumulative count 99980) 222s 99.988% <= 9.871 milliseconds (cumulative count 
99990) 222s 99.994% <= 9.895 milliseconds (cumulative count 100000) 222s 100.000% <= 9.895 milliseconds (cumulative count 100000) 222s 222s Cumulative distribution of latencies: 222s 0.000% <= 0.103 milliseconds (cumulative count 0) 222s 0.080% <= 0.503 milliseconds (cumulative count 80) 222s 0.260% <= 0.607 milliseconds (cumulative count 260) 222s 0.500% <= 0.703 milliseconds (cumulative count 500) 222s 1.760% <= 0.807 milliseconds (cumulative count 1760) 222s 9.620% <= 0.903 milliseconds (cumulative count 9620) 222s 19.990% <= 1.007 milliseconds (cumulative count 19990) 222s 29.260% <= 1.103 milliseconds (cumulative count 29260) 222s 46.440% <= 1.207 milliseconds (cumulative count 46440) 222s 64.110% <= 1.303 milliseconds (cumulative count 64110) 222s 79.540% <= 1.407 milliseconds (cumulative count 79540) 222s 91.280% <= 1.503 milliseconds (cumulative count 91280) 222s 97.390% <= 1.607 milliseconds (cumulative count 97390) 222s 98.760% <= 1.703 milliseconds (cumulative count 98760) 222s 99.220% <= 1.807 milliseconds (cumulative count 99220) 222s 99.410% <= 1.903 milliseconds (cumulative count 99410) 222s 99.500% <= 2.007 milliseconds (cumulative count 99500) 222s 99.550% <= 2.103 milliseconds (cumulative count 99550) 222s 99.610% <= 3.103 milliseconds (cumulative count 99610) 222s 99.730% <= 9.103 milliseconds (cumulative count 99730) 222s 100.000% <= 10.103 milliseconds (cumulative count 100000) 222s 222s Summary: 222s throughput summary: 366300.38 requests per second 222s latency summary (msec): 222s avg min p50 p95 p99 max 222s 1.247 0.448 1.231 1.559 1.743 9.895 222s SADD: rps=61280.0 (overall: 450588.2) avg_msec=0.987 (overall: 0.987) ====== SADD ====== 222s 100000 requests completed in 0.22 seconds 222s 50 parallel clients 222s 3 bytes payload 222s keep alive: 1 222s host configuration "save": 3600 1 300 100 60 10000 222s host configuration "appendonly": no 222s multi-thread: no 222s 222s Latency by percentile distribution: 222s 0.000% <= 0.327 milliseconds 
(cumulative count 20) 222s 50.000% <= 0.999 milliseconds (cumulative count 50070) 222s 75.000% <= 1.119 milliseconds (cumulative count 75470) 222s 87.500% <= 1.199 milliseconds (cumulative count 88020) 222s 93.750% <= 1.247 milliseconds (cumulative count 93970) 222s 96.875% <= 1.287 milliseconds (cumulative count 96970) 222s 98.438% <= 1.327 milliseconds (cumulative count 98500) 222s 99.219% <= 1.367 milliseconds (cumulative count 99300) 222s 99.609% <= 1.391 milliseconds (cumulative count 99610) 222s 99.805% <= 1.431 milliseconds (cumulative count 99820) 222s 99.902% <= 1.487 milliseconds (cumulative count 99910) 222s 99.951% <= 1.527 milliseconds (cumulative count 99960) 222s 99.976% <= 1.575 milliseconds (cumulative count 99980) 222s 99.988% <= 1.607 milliseconds (cumulative count 99990) 222s 99.994% <= 1.679 milliseconds (cumulative count 100000) 222s 100.000% <= 1.679 milliseconds (cumulative count 100000) 222s 222s Cumulative distribution of latencies: 222s 0.000% <= 0.103 milliseconds (cumulative count 0) 222s 0.130% <= 0.407 milliseconds (cumulative count 130) 222s 0.270% <= 0.503 milliseconds (cumulative count 270) 222s 0.350% <= 0.607 milliseconds (cumulative count 350) 222s 2.170% <= 0.703 milliseconds (cumulative count 2170) 222s 15.840% <= 0.807 milliseconds (cumulative count 15840) 222s 29.960% <= 0.903 milliseconds (cumulative count 29960) 222s 52.130% <= 1.007 milliseconds (cumulative count 52130) 222s 72.590% <= 1.103 milliseconds (cumulative count 72590) 222s 89.090% <= 1.207 milliseconds (cumulative count 89090) 222s 97.660% <= 1.303 milliseconds (cumulative count 97660) 222s 99.700% <= 1.407 milliseconds (cumulative count 99700) 222s 99.940% <= 1.503 milliseconds (cumulative count 99940) 222s 99.990% <= 1.607 milliseconds (cumulative count 99990) 222s 100.000% <= 1.703 milliseconds (cumulative count 100000) 222s 222s Summary: 222s throughput summary: 452488.69 requests per second 222s latency summary (msec): 222s avg min p50 p95 p99 max 222s 
0.996 0.320 0.999 1.263 1.351 1.679 222s HSET: rps=101514.0 (overall: 410967.8) avg_msec=1.104 (overall: 1.104) ====== HSET ====== 222s 100000 requests completed in 0.24 seconds 222s 50 parallel clients 222s 3 bytes payload 222s keep alive: 1 222s host configuration "save": 3600 1 300 100 60 10000 222s host configuration "appendonly": no 222s multi-thread: no 222s 222s Latency by percentile distribution: 222s 0.000% <= 0.407 milliseconds (cumulative count 10) 222s 50.000% <= 1.111 milliseconds (cumulative count 50380) 222s 75.000% <= 1.231 milliseconds (cumulative count 75790) 222s 87.500% <= 1.311 milliseconds (cumulative count 88080) 222s 93.750% <= 1.359 milliseconds (cumulative count 93860) 222s 96.875% <= 1.399 milliseconds (cumulative count 96960) 222s 98.438% <= 1.439 milliseconds (cumulative count 98530) 222s 99.219% <= 1.479 milliseconds (cumulative count 99250) 222s 99.609% <= 1.519 milliseconds (cumulative count 99650) 222s 99.805% <= 1.551 milliseconds (cumulative count 99820) 222s 99.902% <= 1.583 milliseconds (cumulative count 99910) 222s 99.951% <= 1.615 milliseconds (cumulative count 99960) 222s 99.976% <= 1.631 milliseconds (cumulative count 99980) 222s 99.988% <= 1.647 milliseconds (cumulative count 99990) 222s 99.994% <= 1.727 milliseconds (cumulative count 100000) 222s 100.000% <= 1.727 milliseconds (cumulative count 100000) 222s 222s Cumulative distribution of latencies: 222s 0.000% <= 0.103 milliseconds (cumulative count 0) 222s 0.010% <= 0.407 milliseconds (cumulative count 10) 222s 0.100% <= 0.503 milliseconds (cumulative count 100) 222s 0.230% <= 0.607 milliseconds (cumulative count 230) 222s 0.400% <= 0.703 milliseconds (cumulative count 400) 222s 7.440% <= 0.807 milliseconds (cumulative count 7440) 222s 17.520% <= 0.903 milliseconds (cumulative count 17520) 222s 28.900% <= 1.007 milliseconds (cumulative count 28900) 222s 48.560% <= 1.103 milliseconds (cumulative count 48560) 222s 71.470% <= 1.207 milliseconds (cumulative count 71470) 222s 
87.120% <= 1.303 milliseconds (cumulative count 87120) 222s 97.340% <= 1.407 milliseconds (cumulative count 97340) 222s 99.490% <= 1.503 milliseconds (cumulative count 99490) 222s 99.950% <= 1.607 milliseconds (cumulative count 99950) 222s 99.990% <= 1.703 milliseconds (cumulative count 99990) 222s 100.000% <= 1.807 milliseconds (cumulative count 100000) 222s 222s Summary: 222s throughput summary: 414937.75 requests per second 222s latency summary (msec): 222s avg min p50 p95 p99 max 222s 1.098 0.400 1.111 1.375 1.471 1.727 222s SPOP: rps=140520.0 (overall: 501857.2) avg_msec=0.879 (overall: 0.879) ====== SPOP ====== 222s 100000 requests completed in 0.20 seconds 222s 50 parallel clients 222s 3 bytes payload 222s keep alive: 1 222s host configuration "save": 3600 1 300 100 60 10000 222s host configuration "appendonly": no 222s multi-thread: no 222s 222s Latency by percentile distribution: 222s 0.000% <= 0.359 milliseconds (cumulative count 10) 222s 50.000% <= 0.903 milliseconds (cumulative count 51440) 222s 75.000% <= 1.023 milliseconds (cumulative count 76000) 222s 87.500% <= 1.103 milliseconds (cumulative count 88280) 222s 93.750% <= 1.159 milliseconds (cumulative count 94510) 222s 96.875% <= 1.199 milliseconds (cumulative count 97030) 222s 98.438% <= 1.239 milliseconds (cumulative count 98520) 222s 99.219% <= 1.279 milliseconds (cumulative count 99250) 222s 99.609% <= 1.327 milliseconds (cumulative count 99640) 222s 99.805% <= 1.367 milliseconds (cumulative count 99830) 222s 99.902% <= 1.407 milliseconds (cumulative count 99920) 222s 99.951% <= 1.431 milliseconds (cumulative count 99960) 222s 99.976% <= 1.455 milliseconds (cumulative count 99980) 222s 99.988% <= 1.487 milliseconds (cumulative count 99990) 222s 99.994% <= 1.535 milliseconds (cumulative count 100000) 222s 100.000% <= 1.535 milliseconds (cumulative count 100000) 222s 222s Cumulative distribution of latencies: 222s 0.000% <= 0.103 milliseconds (cumulative count 0) 222s 0.020% <= 0.407 milliseconds 
(cumulative count 20) 222s 0.250% <= 0.503 milliseconds (cumulative count 250) 222s 1.480% <= 0.607 milliseconds (cumulative count 1480) 222s 10.310% <= 0.703 milliseconds (cumulative count 10310) 222s 30.620% <= 0.807 milliseconds (cumulative count 30620) 222s 51.440% <= 0.903 milliseconds (cumulative count 51440) 222s 73.270% <= 1.007 milliseconds (cumulative count 73270) 222s 88.280% <= 1.103 milliseconds (cumulative count 88280) 222s 97.390% <= 1.207 milliseconds (cumulative count 97390) 222s 99.440% <= 1.303 milliseconds (cumulative count 99440) 222s 99.920% <= 1.407 milliseconds (cumulative count 99920) 222s 99.990% <= 1.503 milliseconds (cumulative count 99990) 222s 100.000% <= 1.607 milliseconds (cumulative count 100000) 222s 222s Summary: 222s throughput summary: 495049.50 requests per second 222s latency summary (msec): 222s avg min p50 p95 p99 max 222s 0.903 0.352 0.903 1.167 1.263 1.535 223s ZADD: rps=173880.0 (overall: 374741.4) avg_msec=1.226 (overall: 1.226) ====== ZADD ====== 223s 100000 requests completed in 0.27 seconds 223s 50 parallel clients 223s 3 bytes payload 223s keep alive: 1 223s host configuration "save": 3600 1 300 100 60 10000 223s host configuration "appendonly": no 223s multi-thread: no 223s 223s Latency by percentile distribution: 223s 0.000% <= 0.319 milliseconds (cumulative count 10) 223s 50.000% <= 1.255 milliseconds (cumulative count 50990) 223s 75.000% <= 1.391 milliseconds (cumulative count 76190) 223s 87.500% <= 1.471 milliseconds (cumulative count 87750) 223s 93.750% <= 1.535 milliseconds (cumulative count 94160) 223s 96.875% <= 1.583 milliseconds (cumulative count 97060) 223s 98.438% <= 1.631 milliseconds (cumulative count 98660) 223s 99.219% <= 1.671 milliseconds (cumulative count 99280) 223s 99.609% <= 1.703 milliseconds (cumulative count 99610) 223s 99.805% <= 1.751 milliseconds (cumulative count 99810) 223s 99.902% <= 1.791 milliseconds (cumulative count 99920) 223s 99.951% <= 1.823 milliseconds (cumulative count 99960) 
223s 99.976% <= 1.855 milliseconds (cumulative count 99980) 223s 99.988% <= 1.887 milliseconds (cumulative count 99990) 223s 99.994% <= 1.967 milliseconds (cumulative count 100000) 223s 100.000% <= 1.967 milliseconds (cumulative count 100000) 223s 223s Cumulative distribution of latencies: 223s 0.000% <= 0.103 milliseconds (cumulative count 0) 223s 0.070% <= 0.407 milliseconds (cumulative count 70) 223s 0.140% <= 0.503 milliseconds (cumulative count 140) 223s 0.300% <= 0.607 milliseconds (cumulative count 300) 223s 0.430% <= 0.703 milliseconds (cumulative count 430) 223s 2.280% <= 0.807 milliseconds (cumulative count 2280) 223s 9.060% <= 0.903 milliseconds (cumulative count 9060) 223s 17.450% <= 1.007 milliseconds (cumulative count 17450) 223s 25.160% <= 1.103 milliseconds (cumulative count 25160) 223s 41.540% <= 1.207 milliseconds (cumulative count 41540) 223s 60.490% <= 1.303 milliseconds (cumulative count 60490) 223s 78.800% <= 1.407 milliseconds (cumulative count 78800) 223s 91.300% <= 1.503 milliseconds (cumulative count 91300) 223s 98.040% <= 1.607 milliseconds (cumulative count 98040) 223s 99.610% <= 1.703 milliseconds (cumulative count 99610) 223s 99.940% <= 1.807 milliseconds (cumulative count 99940) 223s 99.990% <= 1.903 milliseconds (cumulative count 99990) 223s 100.000% <= 2.007 milliseconds (cumulative count 100000) 223s 223s Summary: 223s throughput summary: 373134.31 requests per second 223s latency summary (msec): 223s avg min p50 p95 p99 max 223s 1.231 0.312 1.255 1.551 1.655 1.967 223s ZPOPMIN: rps=172589.7 (overall: 446597.9) avg_msec=0.999 (overall: 0.999) ====== ZPOPMIN ====== 223s 100000 requests completed in 0.21 seconds 223s 50 parallel clients 223s 3 bytes payload 223s keep alive: 1 223s host configuration "save": 3600 1 300 100 60 10000 223s host configuration "appendonly": no 223s multi-thread: no 223s 223s Latency by percentile distribution: 223s 0.000% <= 0.287 milliseconds (cumulative count 10) 223s 50.000% <= 0.927 milliseconds 
(cumulative count 51440) 223s 75.000% <= 1.047 milliseconds (cumulative count 75420) 223s 87.500% <= 1.135 milliseconds (cumulative count 88520) 223s 93.750% <= 1.183 milliseconds (cumulative count 94100) 223s 96.875% <= 1.231 milliseconds (cumulative count 96970) 223s 98.438% <= 1.295 milliseconds (cumulative count 98530) 223s 99.219% <= 1.367 milliseconds (cumulative count 99220) 223s 99.609% <= 7.831 milliseconds (cumulative count 99620) 223s 99.805% <= 8.695 milliseconds (cumulative count 99810) 223s 99.902% <= 8.807 milliseconds (cumulative count 99910) 223s 99.951% <= 8.887 milliseconds (cumulative count 99960) 223s 99.976% <= 8.919 milliseconds (cumulative count 99980) 223s 99.988% <= 8.935 milliseconds (cumulative count 99990) 223s 99.994% <= 8.951 milliseconds (cumulative count 100000) 223s 100.000% <= 8.951 milliseconds (cumulative count 100000) 223s 223s Cumulative distribution of latencies: 223s 0.000% <= 0.103 milliseconds (cumulative count 0) 223s 0.010% <= 0.303 milliseconds (cumulative count 10) 223s 0.160% <= 0.407 milliseconds (cumulative count 160) 223s 0.440% <= 0.503 milliseconds (cumulative count 440) 223s 1.250% <= 0.607 milliseconds (cumulative count 1250) 223s 7.860% <= 0.703 milliseconds (cumulative count 7860) 223s 26.210% <= 0.807 milliseconds (cumulative count 26210) 223s 45.780% <= 0.903 milliseconds (cumulative count 45780) 223s 68.700% <= 1.007 milliseconds (cumulative count 68700) 223s 84.170% <= 1.103 milliseconds (cumulative count 84170) 223s 95.780% <= 1.207 milliseconds (cumulative count 95780) 223s 98.670% <= 1.303 milliseconds (cumulative count 98670) 223s 99.390% <= 1.407 milliseconds (cumulative count 99390) 223s 99.450% <= 1.503 milliseconds (cumulative count 99450) 223s 99.490% <= 1.607 milliseconds (cumulative count 99490) 223s 99.560% <= 1.703 milliseconds (cumulative count 99560) 223s 99.580% <= 1.807 milliseconds (cumulative count 99580) 223s 99.630% <= 8.103 milliseconds (cumulative count 99630) 223s 100.000% <= 9.103 
milliseconds (cumulative count 100000) 223s 223s Summary: 223s throughput summary: 469483.56 requests per second 223s latency summary (msec): 223s avg min p50 p95 p99 max 223s 0.957 0.280 0.927 1.199 1.343 8.951 223s LPUSH (needed to benchmark LRANGE): rps=180680.0 (overall: 344809.2) avg_msec=1.324 (overall: 1.324) ====== LPUSH (needed to benchmark LRANGE) ====== 223s 100000 requests completed in 0.29 seconds 223s 50 parallel clients 223s 3 bytes payload 223s keep alive: 1 223s host configuration "save": 3600 1 300 100 60 10000 223s host configuration "appendonly": no 223s multi-thread: no 223s 223s Latency by percentile distribution: 223s 0.000% <= 0.423 milliseconds (cumulative count 10) 223s 50.000% <= 1.351 milliseconds (cumulative count 50580) 223s 75.000% <= 1.503 milliseconds (cumulative count 75990) 223s 87.500% <= 1.607 milliseconds (cumulative count 88260) 223s 93.750% <= 1.679 milliseconds (cumulative count 93820) 223s 96.875% <= 1.751 milliseconds (cumulative count 97170) 223s 98.438% <= 1.823 milliseconds (cumulative count 98490) 223s 99.219% <= 1.911 milliseconds (cumulative count 99250) 223s 99.609% <= 2.015 milliseconds (cumulative count 99610) 223s 99.805% <= 2.143 milliseconds (cumulative count 99820) 223s 99.902% <= 2.231 milliseconds (cumulative count 99910) 223s 99.951% <= 2.327 milliseconds (cumulative count 99960) 223s 99.976% <= 2.359 milliseconds (cumulative count 99980) 223s 99.988% <= 2.383 milliseconds (cumulative count 99990) 223s 99.994% <= 2.399 milliseconds (cumulative count 100000) 223s 100.000% <= 2.399 milliseconds (cumulative count 100000) 223s 223s Cumulative distribution of latencies: 223s 0.000% <= 0.103 milliseconds (cumulative count 0) 223s 0.060% <= 0.503 milliseconds (cumulative count 60) 223s 0.130% <= 0.607 milliseconds (cumulative count 130) 223s 0.270% <= 0.703 milliseconds (cumulative count 270) 223s 1.000% <= 0.807 milliseconds (cumulative count 1000) 223s 4.280% <= 0.903 milliseconds (cumulative count 4280) 223s 
10.700% <= 1.007 milliseconds (cumulative count 10700) 223s 17.590% <= 1.103 milliseconds (cumulative count 17590) 223s 28.290% <= 1.207 milliseconds (cumulative count 28290) 223s 42.480% <= 1.303 milliseconds (cumulative count 42480) 223s 60.260% <= 1.407 milliseconds (cumulative count 60260) 223s 75.990% <= 1.503 milliseconds (cumulative count 75990) 223s 88.260% <= 1.607 milliseconds (cumulative count 88260) 223s 95.180% <= 1.703 milliseconds (cumulative count 95180) 223s 98.270% <= 1.807 milliseconds (cumulative count 98270) 223s 99.210% <= 1.903 milliseconds (cumulative count 99210) 223s 99.580% <= 2.007 milliseconds (cumulative count 99580) 223s 99.780% <= 2.103 milliseconds (cumulative count 99780) 223s 100.000% <= 3.103 milliseconds (cumulative count 100000) 223s 223s Summary: 223s throughput summary: 344827.59 requests per second 223s latency summary (msec): 223s avg min p50 p95 p99 max 223s 1.335 0.416 1.351 1.703 1.871 2.399 224s LRANGE_100 (first 100 elements): rps=30398.4 (overall: 84777.8) avg_msec=3.271 (overall: 3.271) LRANGE_100 (first 100 elements): rps=88809.5 (overall: 87748.5) avg_msec=2.929 (overall: 3.016) LRANGE_100 (first 100 elements): rps=88888.9 (overall: 88232.3) avg_msec=2.957 (overall: 2.991) LRANGE_100 (first 100 elements): rps=85634.9 (overall: 87458.6) avg_msec=3.005 (overall: 2.995) LRANGE_100 (first 100 elements): rps=88577.1 (overall: 87716.1) avg_msec=2.943 (overall: 2.983) ====== LRANGE_100 (first 100 elements) ====== 224s 100000 requests completed in 1.14 seconds 224s 50 parallel clients 224s 3 bytes payload 224s keep alive: 1 224s host configuration "save": 3600 1 300 100 60 10000 224s host configuration "appendonly": no 224s multi-thread: no 224s 224s Latency by percentile distribution: 224s 0.000% <= 0.439 milliseconds (cumulative count 10) 224s 50.000% <= 2.919 milliseconds (cumulative count 51510) 224s 75.000% <= 3.023 milliseconds (cumulative count 75450) 224s 87.500% <= 3.135 milliseconds (cumulative count 87780) 224s 
93.750% <= 3.263 milliseconds (cumulative count 93770) 224s 96.875% <= 3.455 milliseconds (cumulative count 96910) 224s 98.438% <= 3.743 milliseconds (cumulative count 98440) 224s 99.219% <= 4.671 milliseconds (cumulative count 99220) 224s 99.609% <= 7.303 milliseconds (cumulative count 99610) 224s 99.805% <= 10.119 milliseconds (cumulative count 99820) 224s 99.902% <= 10.207 milliseconds (cumulative count 99910) 224s 99.951% <= 10.231 milliseconds (cumulative count 99970) 224s 99.976% <= 10.239 milliseconds (cumulative count 99980) 224s 99.988% <= 10.279 milliseconds (cumulative count 99990) 224s 99.994% <= 10.319 milliseconds (cumulative count 100000) 224s 100.000% <= 10.319 milliseconds (cumulative count 100000) 224s 224s Cumulative distribution of latencies: 224s 0.000% <= 0.103 milliseconds (cumulative count 0) 224s 0.010% <= 0.503 milliseconds (cumulative count 10) 224s 85.360% <= 3.103 milliseconds (cumulative count 85360) 224s 98.930% <= 4.103 milliseconds (cumulative count 98930) 224s 99.280% <= 5.103 milliseconds (cumulative count 99280) 224s 99.450% <= 6.103 milliseconds (cumulative count 99450) 224s 99.590% <= 7.103 milliseconds (cumulative count 99590) 224s 99.670% <= 8.103 milliseconds (cumulative count 99670) 224s 99.740% <= 9.103 milliseconds (cumulative count 99740) 224s 99.800% <= 10.103 milliseconds (cumulative count 99800) 224s 100.000% <= 11.103 milliseconds (cumulative count 100000) 224s 224s Summary: 224s throughput summary: 87719.30 requests per second 224s latency summary (msec): 224s avg min p50 p95 p99 max 224s 2.985 0.432 2.919 3.327 4.183 10.319 229s LRANGE_300 (first 300 elements): rps=15019.9 (overall: 18125.0) avg_msec=16.181 (overall: 16.181) LRANGE_300 (first 300 elements): rps=25256.0 (overall: 22017.5) avg_msec=10.381 (overall: 12.549) LRANGE_300 (first 300 elements): rps=25394.4 (overall: 23213.0) avg_msec=10.403 (overall: 11.718) LRANGE_300 (first 300 elements): rps=23354.6 (overall: 23250.0) avg_msec=11.450 (overall: 11.648) 
LRANGE_300 (first 300 elements): rps=24786.0 (overall: 23574.4) avg_msec=11.021 (overall: 11.509) LRANGE_300 (first 300 elements): rps=25952.2 (overall: 23980.9) avg_msec=9.607 (overall: 11.157) LRANGE_300 (first 300 elements): rps=25562.5 (overall: 24215.8) avg_msec=10.080 (overall: 10.988) LRANGE_300 (first 300 elements): rps=25167.3 (overall: 24339.2) avg_msec=9.676 (overall: 10.812) LRANGE_300 (first 300 elements): rps=25078.7 (overall: 24423.3) avg_msec=10.414 (overall: 10.766) LRANGE_300 (first 300 elements): rps=25492.1 (overall: 24531.6) avg_msec=9.990 (overall: 10.684) LRANGE_300 (first 300 elements): rps=25649.4 (overall: 24634.0) avg_msec=9.956 (overall: 10.614) LRANGE_300 (first 300 elements): rps=25645.7 (overall: 24719.9) avg_msec=9.730 (overall: 10.537) LRANGE_300 (first 300 elements): rps=24992.0 (overall: 24740.9) avg_msec=9.987 (overall: 10.494) LRANGE_300 (first 300 elements): rps=25660.1 (overall: 24807.4) avg_msec=10.008 (overall: 10.457) LRANGE_300 (first 300 elements): rps=24566.9 (overall: 24791.1) avg_msec=10.648 (overall: 10.470) LRANGE_300 (first 300 elements): rps=25697.2 (overall: 24848.0) avg_msec=9.919 (overall: 10.434) ====== LRANGE_300 (first 300 elements) ====== 229s 100000 requests completed in 4.02 seconds 229s 50 parallel clients 229s 3 bytes payload 229s keep alive: 1 229s host configuration "save": 3600 1 300 100 60 10000 229s host configuration "appendonly": no 229s multi-thread: no 229s 229s Latency by percentile distribution: 229s 0.000% <= 1.191 milliseconds (cumulative count 10) 229s 50.000% <= 9.783 milliseconds (cumulative count 50010) 229s 75.000% <= 11.791 milliseconds (cumulative count 75010) 229s 87.500% <= 14.031 milliseconds (cumulative count 87520) 229s 93.750% <= 16.183 milliseconds (cumulative count 93760) 229s 96.875% <= 19.055 milliseconds (cumulative count 96880) 229s 98.438% <= 21.311 milliseconds (cumulative count 98450) 229s 99.219% <= 24.271 milliseconds (cumulative count 99230) 229s 99.609% <= 26.607 
milliseconds (cumulative count 99610) 229s 99.805% <= 28.959 milliseconds (cumulative count 99810) 229s 99.902% <= 31.679 milliseconds (cumulative count 99910) 229s 99.951% <= 33.343 milliseconds (cumulative count 99960) 229s 99.976% <= 33.919 milliseconds (cumulative count 99980) 229s 99.988% <= 34.175 milliseconds (cumulative count 99990) 229s 99.994% <= 34.431 milliseconds (cumulative count 100000) 229s 100.000% <= 34.431 milliseconds (cumulative count 100000) 229s 229s Cumulative distribution of latencies: 229s 0.000% <= 0.103 milliseconds (cumulative count 0) 229s 0.030% <= 1.207 milliseconds (cumulative count 30) 229s 0.080% <= 1.303 milliseconds (cumulative count 80) 229s 0.140% <= 1.503 milliseconds (cumulative count 140) 229s 0.180% <= 1.607 milliseconds (cumulative count 180) 229s 0.210% <= 1.703 milliseconds (cumulative count 210) 229s 0.280% <= 1.807 milliseconds (cumulative count 280) 229s 0.310% <= 1.903 milliseconds (cumulative count 310) 229s 0.330% <= 2.007 milliseconds (cumulative count 330) 229s 0.360% <= 2.103 milliseconds (cumulative count 360) 229s 0.550% <= 3.103 milliseconds (cumulative count 550) 229s 0.770% <= 4.103 milliseconds (cumulative count 770) 229s 1.960% <= 5.103 milliseconds (cumulative count 1960) 229s 5.230% <= 6.103 milliseconds (cumulative count 5230) 229s 11.260% <= 7.103 milliseconds (cumulative count 11260) 229s 22.650% <= 8.103 milliseconds (cumulative count 22650) 229s 38.690% <= 9.103 milliseconds (cumulative count 38690) 229s 55.300% <= 10.103 milliseconds (cumulative count 55300) 229s 68.810% <= 11.103 milliseconds (cumulative count 68810) 229s 77.240% <= 12.103 milliseconds (cumulative count 77240) 229s 83.150% <= 13.103 milliseconds (cumulative count 83150) 229s 87.860% <= 14.103 milliseconds (cumulative count 87860) 229s 91.510% <= 15.103 milliseconds (cumulative count 91510) 229s 93.610% <= 16.103 milliseconds (cumulative count 93610) 229s 95.040% <= 17.103 milliseconds (cumulative count 95040) 229s 96.000% <= 
18.111 milliseconds (cumulative count 96000) 229s 96.920% <= 19.103 milliseconds (cumulative count 96920) 229s 97.770% <= 20.111 milliseconds (cumulative count 97770) 229s 98.360% <= 21.103 milliseconds (cumulative count 98360) 229s 98.740% <= 22.111 milliseconds (cumulative count 98740) 229s 98.980% <= 23.103 milliseconds (cumulative count 98980) 229s 99.180% <= 24.111 milliseconds (cumulative count 99180) 229s 99.390% <= 25.103 milliseconds (cumulative count 99390) 229s 99.540% <= 26.111 milliseconds (cumulative count 99540) 229s 99.660% <= 27.103 milliseconds (cumulative count 99660) 229s 99.750% <= 28.111 milliseconds (cumulative count 99750) 229s 99.810% <= 29.103 milliseconds (cumulative count 99810) 229s 99.850% <= 30.111 milliseconds (cumulative count 99850) 229s 99.890% <= 31.103 milliseconds (cumulative count 99890) 229s 99.920% <= 32.111 milliseconds (cumulative count 99920) 229s 99.950% <= 33.119 milliseconds (cumulative count 99950) 229s 99.980% <= 34.111 milliseconds (cumulative count 99980) 229s 100.000% <= 35.103 milliseconds (cumulative count 100000) 229s 229s Summary: 229s throughput summary: 24850.89 requests per second 229s latency summary (msec): 229s avg min p50 p95 p99 max 229s 10.427 1.184 9.783 17.071 23.215 34.431 237s LRANGE_500 (first 500 elements): rps=8612.0 (overall: 9611.6) avg_msec=28.138 (overall: 28.138) LRANGE_500 (first 500 elements): rps=9840.5 (overall: 9733.9) avg_msec=26.741 (overall: 27.383) LRANGE_500 (first 500 elements): rps=12302.8 (overall: 10614.8) avg_msec=20.587 (overall: 24.682) LRANGE_500 (first 500 elements): rps=13739.1 (overall: 11417.3) avg_msec=19.393 (overall: 23.048) LRANGE_500 (first 500 elements): rps=13290.2 (overall: 11802.4) avg_msec=19.426 (overall: 22.209) LRANGE_500 (first 500 elements): rps=13303.5 (overall: 12060.1) avg_msec=19.138 (overall: 21.627) LRANGE_500 (first 500 elements): rps=13027.9 (overall: 12199.1) avg_msec=20.758 (overall: 21.494) LRANGE_500 (first 500 elements): rps=12834.6 
(overall: 12279.7) avg_msec=20.408 (overall: 21.350) LRANGE_500 (first 500 elements): rps=13087.0 (overall: 12370.3) avg_msec=18.620 (overall: 21.026) LRANGE_500 (first 500 elements): rps=12827.5 (overall: 12416.7) avg_msec=20.362 (overall: 20.956) LRANGE_500 (first 500 elements): rps=13533.3 (overall: 12519.7) avg_msec=18.134 (overall: 20.675) LRANGE_500 (first 500 elements): rps=11251.0 (overall: 12412.6) avg_msec=23.023 (overall: 20.855) LRANGE_500 (first 500 elements): rps=10253.0 (overall: 12245.6) avg_msec=28.057 (overall: 21.321) LRANGE_500 (first 500 elements): rps=9846.2 (overall: 12069.1) avg_msec=26.250 (overall: 21.617) LRANGE_500 (first 500 elements): rps=10119.5 (overall: 11939.7) avg_msec=27.084 (overall: 21.924) LRANGE_500 (first 500 elements): rps=9918.9 (overall: 11810.3) avg_msec=28.435 (overall: 22.274) LRANGE_500 (first 500 elements): rps=9247.0 (overall: 11660.5) avg_msec=28.022 (overall: 22.541) LRANGE_500 (first 500 elements): rps=10229.2 (overall: 11580.8) avg_msec=28.155 (overall: 22.817) LRANGE_500 (first 500 elements): rps=9931.0 (overall: 11491.3) avg_msec=26.281 (overall: 22.979) LRANGE_500 (first 500 elements): rps=10116.0 (overall: 11423.3) avg_msec=30.073 (overall: 23.290) LRANGE_500 (first 500 elements): rps=13019.7 (overall: 11499.6) avg_msec=19.928 (overall: 23.108) LRANGE_500 (first 500 elements): rps=13964.3 (overall: 11611.3) avg_msec=18.706 (overall: 22.868) LRANGE_500 (first 500 elements): rps=10996.1 (overall: 11584.2) avg_msec=23.578 (overall: 22.898) LRANGE_500 (first 500 elements): rps=11142.3 (overall: 11565.8) avg_msec=25.151 (overall: 22.988) LRANGE_500 (first 500 elements): rps=14772.5 (overall: 11695.0) avg_msec=13.837 (overall: 22.522) LRANGE_500 (first 500 elements): rps=13090.9 (overall: 11748.7) avg_msec=16.772 (overall: 22.276) LRANGE_500 (first 500 elements): rps=14426.3 (overall: 11847.0) avg_msec=15.211 (overall: 21.960) LRANGE_500 (first 500 elements): rps=12715.4 (overall: 11878.1) avg_msec=20.580 
(overall: 21.907) LRANGE_500 (first 500 elements): rps=14119.5 (overall: 11954.7) avg_msec=18.861 (overall: 21.784) LRANGE_500 (first 500 elements): rps=12218.4 (overall: 11963.8) avg_msec=21.501 (overall: 21.774) LRANGE_500 (first 500 elements): rps=10135.5 (overall: 11905.3) avg_msec=25.985 (overall: 21.889) LRANGE_500 (first 500 elements): rps=10945.1 (overall: 11875.1) avg_msec=26.641 (overall: 22.026) LRANGE_500 (first 500 elements): rps=12496.1 (overall: 11894.0) avg_msec=19.836 (overall: 21.957) ====== LRANGE_500 (first 500 elements) ====== 237s 100000 requests completed in 8.40 seconds 237s 50 parallel clients 237s 3 bytes payload 237s keep alive: 1 237s host configuration "save": 3600 1 300 100 60 10000 237s host configuration "appendonly": no 237s multi-thread: no 237s 237s Latency by percentile distribution: 237s 0.000% <= 0.631 milliseconds (cumulative count 10) 237s 50.000% <= 21.999 milliseconds (cumulative count 50030) 237s 75.000% <= 28.431 milliseconds (cumulative count 75050) 237s 87.500% <= 33.887 milliseconds (cumulative count 87500) 237s 93.750% <= 36.255 milliseconds (cumulative count 93830) 237s 96.875% <= 37.823 milliseconds (cumulative count 96930) 237s 98.438% <= 39.327 milliseconds (cumulative count 98440) 237s 99.219% <= 42.655 milliseconds (cumulative count 99220) 237s 99.609% <= 51.583 milliseconds (cumulative count 99610) 237s 99.805% <= 56.799 milliseconds (cumulative count 99810) 237s 99.902% <= 59.103 milliseconds (cumulative count 99910) 237s 99.951% <= 64.639 milliseconds (cumulative count 99960) 237s 99.976% <= 65.855 milliseconds (cumulative count 99980) 237s 99.988% <= 66.111 milliseconds (cumulative count 99990) 237s 99.994% <= 66.431 milliseconds (cumulative count 100000) 237s 100.000% <= 66.431 milliseconds (cumulative count 100000) 237s 237s Cumulative distribution of latencies: 237s 0.000% <= 0.103 milliseconds (cumulative count 0) 237s 0.010% <= 0.703 milliseconds (cumulative count 10) 237s 0.020% <= 1.007 milliseconds 
(cumulative count 20) 237s 0.030% <= 1.207 milliseconds (cumulative count 30) 237s 0.050% <= 1.303 milliseconds (cumulative count 50) 237s 0.060% <= 1.407 milliseconds (cumulative count 60) 237s 0.110% <= 1.503 milliseconds (cumulative count 110) 237s 0.130% <= 1.607 milliseconds (cumulative count 130) 237s 0.340% <= 1.807 milliseconds (cumulative count 340) 237s 0.420% <= 1.903 milliseconds (cumulative count 420) 237s 0.560% <= 2.007 milliseconds (cumulative count 560) 237s 0.780% <= 2.103 milliseconds (cumulative count 780) 237s 1.930% <= 3.103 milliseconds (cumulative count 1930) 237s 2.660% <= 4.103 milliseconds (cumulative count 2660) 237s 3.630% <= 5.103 milliseconds (cumulative count 3630) 237s 5.000% <= 6.103 milliseconds (cumulative count 5000) 237s 6.020% <= 7.103 milliseconds (cumulative count 6020) 237s 7.110% <= 8.103 milliseconds (cumulative count 7110) 237s 8.370% <= 9.103 milliseconds (cumulative count 8370) 237s 9.760% <= 10.103 milliseconds (cumulative count 9760) 237s 11.700% <= 11.103 milliseconds (cumulative count 11700) 237s 14.430% <= 12.103 milliseconds (cumulative count 14430) 237s 18.330% <= 13.103 milliseconds (cumulative count 18330) 237s 22.420% <= 14.103 milliseconds (cumulative count 22420) 237s 26.480% <= 15.103 milliseconds (cumulative count 26480) 237s 29.880% <= 16.103 milliseconds (cumulative count 29880) 237s 32.960% <= 17.103 milliseconds (cumulative count 32960) 237s 35.980% <= 18.111 milliseconds (cumulative count 35980) 237s 39.100% <= 19.103 milliseconds (cumulative count 39100) 237s 42.370% <= 20.111 milliseconds (cumulative count 42370) 237s 46.470% <= 21.103 milliseconds (cumulative count 46470) 237s 50.520% <= 22.111 milliseconds (cumulative count 50520) 237s 54.990% <= 23.103 milliseconds (cumulative count 54990) 237s 59.590% <= 24.111 milliseconds (cumulative count 59590) 237s 63.940% <= 25.103 milliseconds (cumulative count 63940) 237s 67.700% <= 26.111 milliseconds (cumulative count 67700) 237s 71.010% <= 27.103 
milliseconds (cumulative count 71010) 237s 74.130% <= 28.111 milliseconds (cumulative count 74130) 237s 76.740% <= 29.103 milliseconds (cumulative count 76740) 237s 79.190% <= 30.111 milliseconds (cumulative count 79190) 237s 81.480% <= 31.103 milliseconds (cumulative count 81480) 237s 83.650% <= 32.111 milliseconds (cumulative count 83650) 237s 85.900% <= 33.119 milliseconds (cumulative count 85900) 237s 88.050% <= 34.111 milliseconds (cumulative count 88050) 237s 90.760% <= 35.103 milliseconds (cumulative count 90760) 237s 93.530% <= 36.127 milliseconds (cumulative count 93530) 237s 95.670% <= 37.119 milliseconds (cumulative count 95670) 237s 97.390% <= 38.111 milliseconds (cumulative count 97390) 237s 98.370% <= 39.103 milliseconds (cumulative count 98370) 237s 98.700% <= 40.127 milliseconds (cumulative count 98700) 237s 98.980% <= 41.119 milliseconds (cumulative count 98980) 237s 99.170% <= 42.111 milliseconds (cumulative count 99170) 237s 99.260% <= 43.103 milliseconds (cumulative count 99260) 237s 99.370% <= 44.127 milliseconds (cumulative count 99370) 237s 99.430% <= 45.119 milliseconds (cumulative count 99430) 237s 99.470% <= 46.111 milliseconds (cumulative count 99470) 237s 99.490% <= 47.103 milliseconds (cumulative count 99490) 237s 99.530% <= 48.127 milliseconds (cumulative count 99530) 237s 99.570% <= 49.119 milliseconds (cumulative count 99570) 237s 99.580% <= 50.111 milliseconds (cumulative count 99580) 237s 99.590% <= 51.103 milliseconds (cumulative count 99590) 237s 99.630% <= 52.127 milliseconds (cumulative count 99630) 237s 99.640% <= 53.119 milliseconds (cumulative count 99640) 237s 99.660% <= 54.111 milliseconds (cumulative count 99660) 237s 99.710% <= 55.103 milliseconds (cumulative count 99710) 237s 99.770% <= 56.127 milliseconds (cumulative count 99770) 237s 99.820% <= 57.119 milliseconds (cumulative count 99820) 237s 99.860% <= 58.111 milliseconds (cumulative count 99860) 237s 99.910% <= 59.103 milliseconds (cumulative count 99910) 237s 
99.930% <= 60.127 milliseconds (cumulative count 99930) 237s 99.970% <= 65.119 milliseconds (cumulative count 99970) 237s 99.990% <= 66.111 milliseconds (cumulative count 99990) 237s 100.000% <= 67.135 milliseconds (cumulative count 100000) 237s 237s Summary: 237s throughput summary: 11903.34 requests per second 237s latency summary (msec): 237s avg min p50 p95 p99 max 237s 21.930 0.624 21.999 36.767 41.151 66.431 247s LRANGE_600 (first 600 elements): rps=8736.2 (overall: 10668.3) avg_msec=20.435 (overall: 20.435) LRANGE_600 (first 600 elements): rps=10531.5 (overall: 10593.1) avg_msec=22.817 (overall: 21.737) LRANGE_600 (first 600 elements): rps=10992.0 (overall: 10733.5) avg_msec=24.311 (overall: 22.665) LRANGE_600 (first 600 elements): rps=11596.8 (overall: 10959.6) avg_msec=20.383 (overall: 22.033) LRANGE_600 (first 600 elements): rps=12341.2 (overall: 11248.2) avg_msec=14.440 (overall: 20.293) LRANGE_600 (first 600 elements): rps=11224.0 (overall: 11244.1) avg_msec=20.374 (overall: 20.307) LRANGE_600 (first 600 elements): rps=9960.2 (overall: 11056.9) avg_msec=25.567 (overall: 20.997) LRANGE_600 (first 600 elements): rps=8300.8 (overall: 10700.2) avg_msec=32.186 (overall: 22.121) LRANGE_600 (first 600 elements): rps=8245.1 (overall: 10417.9) avg_msec=31.316 (overall: 22.958) LRANGE_600 (first 600 elements): rps=8272.7 (overall: 10199.8) avg_msec=31.087 (overall: 23.628) LRANGE_600 (first 600 elements): rps=8011.9 (overall: 9997.8) avg_msec=33.027 (overall: 24.323) LRANGE_600 (first 600 elements): rps=11210.3 (overall: 10099.9) avg_msec=22.471 (overall: 24.150) LRANGE_600 (first 600 elements): rps=8678.3 (overall: 9987.1) avg_msec=28.978 (overall: 24.483) LRANGE_600 (first 600 elements): rps=8255.0 (overall: 9862.9) avg_msec=32.144 (overall: 24.943) LRANGE_600 (first 600 elements): rps=8844.6 (overall: 9794.8) avg_msec=32.420 (overall: 25.394) LRANGE_600 (first 600 elements): rps=8155.6 (overall: 9689.8) avg_msec=32.289 (overall: 25.766) LRANGE_600 (first 600 
elements): rps=8395.3 (overall: 9612.9) avg_msec=30.660 (overall: 26.020) LRANGE_600 (first 600 elements): rps=7806.2 (overall: 9509.8) avg_msec=32.744 (overall: 26.335) LRANGE_600 (first 600 elements): rps=7912.4 (overall: 9425.8) avg_msec=31.893 (overall: 26.580) LRANGE_600 (first 600 elements): rps=8725.1 (overall: 9390.8) avg_msec=31.641 (overall: 26.815) LRANGE_600 (first 600 elements): rps=9884.5 (overall: 9414.3) avg_msec=25.818 (overall: 26.765) LRANGE_600 (first 600 elements): rps=11613.3 (overall: 9516.1) avg_msec=20.494 (overall: 26.411) LRANGE_600 (first 600 elements): rps=11256.9 (overall: 9592.3) avg_msec=21.539 (overall: 26.161) LRANGE_600 (first 600 elements): rps=10486.2 (overall: 9629.7) avg_msec=23.660 (overall: 26.047) LRANGE_600 (first 600 elements): rps=8286.3 (overall: 9575.3) avg_msec=30.991 (overall: 26.220) LRANGE_600 (first 600 elements): rps=8146.2 (overall: 9520.0) avg_msec=31.416 (overall: 26.392) LRANGE_600 (first 600 elements): rps=8086.3 (overall: 9466.2) avg_msec=31.899 (overall: 26.569) LRANGE_600 (first 600 elements): rps=8442.2 (overall: 9429.8) avg_msec=31.783 (overall: 26.735) LRANGE_600 (first 600 elements): rps=8685.3 (overall: 9404.2) avg_msec=32.424 (overall: 26.915) LRANGE_600 (first 600 elements): rps=8565.7 (overall: 9376.3) avg_msec=32.157 (overall: 27.075) LRANGE_600 (first 600 elements): rps=8269.8 (overall: 9340.6) avg_msec=30.954 (overall: 27.185) LRANGE_600 (first 600 elements): rps=7945.3 (overall: 9296.3) avg_msec=31.969 (overall: 27.315) LRANGE_600 (first 600 elements): rps=10492.1 (overall: 9332.5) avg_msec=24.884 (overall: 27.232) LRANGE_600 (first 600 elements): rps=10749.0 (overall: 9374.1) avg_msec=22.910 (overall: 27.087) LRANGE_600 (first 600 elements): rps=9496.0 (overall: 9377.5) avg_msec=28.421 (overall: 27.125) LRANGE_600 (first 600 elements): rps=8239.4 (overall: 9345.0) avg_msec=31.462 (overall: 27.235) LRANGE_600 (first 600 elements): rps=8226.1 (overall: 9313.7) avg_msec=32.013 (overall: 27.353) 
LRANGE_600 (first 600 elements): rps=9150.2 (overall: 9309.4) avg_msec=29.502 (overall: 27.408) LRANGE_600 (first 600 elements): rps=10992.1 (overall: 9352.5) avg_msec=22.512 (overall: 27.261) LRANGE_600 (first 600 elements): rps=10691.7 (overall: 9386.1) avg_msec=24.072 (overall: 27.170) LRANGE_600 (first 600 elements): rps=11910.2 (overall: 9448.5) avg_msec=20.901 (overall: 26.974) ====== LRANGE_600 (first 600 elements) ====== 247s 100000 requests completed in 10.54 seconds 247s 50 parallel clients 247s 3 bytes payload 247s keep alive: 1 247s host configuration "save": 3600 1 300 100 60 10000 247s host configuration "appendonly": no 247s multi-thread: no 247s 247s Latency by percentile distribution: 247s 0.000% <= 1.295 milliseconds (cumulative count 10) 247s 50.000% <= 28.271 milliseconds (cumulative count 50000) 247s 75.000% <= 36.575 milliseconds (cumulative count 75030) 247s 87.500% <= 40.063 milliseconds (cumulative count 87510) 247s 93.750% <= 41.855 milliseconds (cumulative count 93770) 247s 96.875% <= 43.167 milliseconds (cumulative count 96890) 247s 98.438% <= 44.351 milliseconds (cumulative count 98450) 247s 99.219% <= 45.727 milliseconds (cumulative count 99220) 247s 99.609% <= 46.655 milliseconds (cumulative count 99630) 247s 99.805% <= 47.647 milliseconds (cumulative count 99810) 247s 99.902% <= 48.127 milliseconds (cumulative count 99910) 247s 99.951% <= 48.863 milliseconds (cumulative count 99960) 247s 99.976% <= 49.119 milliseconds (cumulative count 99980) 247s 99.988% <= 49.343 milliseconds (cumulative count 99990) 247s 99.994% <= 49.631 milliseconds (cumulative count 100000) 247s 100.000% <= 49.631 milliseconds (cumulative count 100000) 247s 247s Cumulative distribution of latencies: 247s 0.000% <= 0.103 milliseconds (cumulative count 0) 247s 0.020% <= 1.303 milliseconds (cumulative count 20) 247s 0.030% <= 1.407 milliseconds (cumulative count 30) 247s 0.070% <= 1.607 milliseconds (cumulative count 70) 247s 0.110% <= 1.703 milliseconds 
(cumulative count 110) 247s 0.180% <= 1.807 milliseconds (cumulative count 180) 247s 0.280% <= 1.903 milliseconds (cumulative count 280) 247s 0.430% <= 2.007 milliseconds (cumulative count 430) 247s 0.690% <= 2.103 milliseconds (cumulative count 690) 247s 3.850% <= 3.103 milliseconds (cumulative count 3850) 247s 5.010% <= 4.103 milliseconds (cumulative count 5010) 247s 5.450% <= 5.103 milliseconds (cumulative count 5450) 247s 6.000% <= 6.103 milliseconds (cumulative count 6000) 247s 6.450% <= 7.103 milliseconds (cumulative count 6450) 247s 7.190% <= 8.103 milliseconds (cumulative count 7190) 247s 8.020% <= 9.103 milliseconds (cumulative count 8020) 247s 9.110% <= 10.103 milliseconds (cumulative count 9110) 247s 10.180% <= 11.103 milliseconds (cumulative count 10180) 247s 11.490% <= 12.103 milliseconds (cumulative count 11490) 247s 13.170% <= 13.103 milliseconds (cumulative count 13170) 247s 15.230% <= 14.103 milliseconds (cumulative count 15230) 247s 17.850% <= 15.103 milliseconds (cumulative count 17850) 247s 20.310% <= 16.103 milliseconds (cumulative count 20310) 247s 22.450% <= 17.103 milliseconds (cumulative count 22450) 247s 24.690% <= 18.111 milliseconds (cumulative count 24690) 247s 26.800% <= 19.103 milliseconds (cumulative count 26800) 247s 28.780% <= 20.111 milliseconds (cumulative count 28780) 247s 30.430% <= 21.103 milliseconds (cumulative count 30430) 247s 31.990% <= 22.111 milliseconds (cumulative count 31990) 247s 33.740% <= 23.103 milliseconds (cumulative count 33740) 247s 35.880% <= 24.111 milliseconds (cumulative count 35880) 247s 38.620% <= 25.103 milliseconds (cumulative count 38620) 247s 41.580% <= 26.111 milliseconds (cumulative count 41580) 247s 45.200% <= 27.103 milliseconds (cumulative count 45200) 247s 49.290% <= 28.111 milliseconds (cumulative count 49290) 247s 53.390% <= 29.103 milliseconds (cumulative count 53390) 247s 57.510% <= 30.111 milliseconds (cumulative count 57510) 247s 61.650% <= 31.103 milliseconds (cumulative count 61650) 
247s 65.410% <= 32.111 milliseconds (cumulative count 65410) 247s 67.900% <= 33.119 milliseconds (cumulative count 67900) 247s 70.080% <= 34.111 milliseconds (cumulative count 70080) 247s 71.940% <= 35.103 milliseconds (cumulative count 71940) 247s 74.040% <= 36.127 milliseconds (cumulative count 74040) 247s 76.270% <= 37.119 milliseconds (cumulative count 76270) 247s 79.150% <= 38.111 milliseconds (cumulative count 79150) 247s 83.320% <= 39.103 milliseconds (cumulative count 83320) 247s 87.860% <= 40.127 milliseconds (cumulative count 87860) 247s 91.520% <= 41.119 milliseconds (cumulative count 91520) 247s 94.470% <= 42.111 milliseconds (cumulative count 94470) 247s 96.810% <= 43.103 milliseconds (cumulative count 96810) 247s 98.250% <= 44.127 milliseconds (cumulative count 98250) 247s 98.900% <= 45.119 milliseconds (cumulative count 98900) 247s 99.400% <= 46.111 milliseconds (cumulative count 99400) 247s 99.700% <= 47.103 milliseconds (cumulative count 99700) 247s 99.910% <= 48.127 milliseconds (cumulative count 99910) 247s 99.980% <= 49.119 milliseconds (cumulative count 99980) 247s 100.000% <= 50.111 milliseconds (cumulative count 100000) 247s 247s Summary: 247s throughput summary: 9489.47 requests per second 247s latency summary (msec): 247s avg min p50 p95 p99 max 247s 26.840 1.288 28.271 42.335 45.311 49.631 248s MSET (10 keys): rps=37888.4 (overall: 166842.1) avg_msec=2.738 (overall: 2.738) MSET (10 keys): rps=175640.0 (overall: 174006.5) avg_msec=2.719 (overall: 2.722) MSET (10 keys): rps=171400.0 (overall: 172836.6) avg_msec=2.769 (overall: 2.743) ====== MSET (10 keys) ====== 248s 100000 requests completed in 0.58 seconds 248s 50 parallel clients 248s 3 bytes payload 248s keep alive: 1 248s host configuration "save": 3600 1 300 100 60 10000 248s host configuration "appendonly": no 248s multi-thread: no 248s 248s Latency by percentile distribution: 248s 0.000% <= 0.527 milliseconds (cumulative count 10) 248s 50.000% <= 2.791 milliseconds (cumulative count 
51190) 248s 75.000% <= 2.951 milliseconds (cumulative count 75000) 248s 87.500% <= 3.071 milliseconds (cumulative count 88180) 248s 93.750% <= 3.151 milliseconds (cumulative count 93760) 248s 96.875% <= 3.239 milliseconds (cumulative count 96930) 248s 98.438% <= 3.343 milliseconds (cumulative count 98450) 248s 99.219% <= 3.487 milliseconds (cumulative count 99250) 248s 99.609% <= 3.623 milliseconds (cumulative count 99610) 248s 99.805% <= 3.847 milliseconds (cumulative count 99810) 248s 99.902% <= 4.071 milliseconds (cumulative count 99910) 248s 99.951% <= 4.239 milliseconds (cumulative count 99960) 248s 99.976% <= 4.287 milliseconds (cumulative count 99980) 248s 99.988% <= 4.335 milliseconds (cumulative count 99990) 248s 99.994% <= 4.519 milliseconds (cumulative count 100000) 248s 100.000% <= 4.519 milliseconds (cumulative count 100000) 248s 248s Cumulative distribution of latencies: 248s 0.000% <= 0.103 milliseconds (cumulative count 0) 248s 0.030% <= 0.607 milliseconds (cumulative count 30) 248s 0.060% <= 1.207 milliseconds (cumulative count 60) 248s 0.100% <= 1.407 milliseconds (cumulative count 100) 248s 0.370% <= 1.503 milliseconds (cumulative count 370) 248s 1.130% <= 1.607 milliseconds (cumulative count 1130) 248s 3.430% <= 1.703 milliseconds (cumulative count 3430) 248s 5.430% <= 1.807 milliseconds (cumulative count 5430) 248s 6.450% <= 1.903 milliseconds (cumulative count 6450) 248s 6.750% <= 2.007 milliseconds (cumulative count 6750) 248s 6.910% <= 2.103 milliseconds (cumulative count 6910) 248s 90.720% <= 3.103 milliseconds (cumulative count 90720) 248s 99.930% <= 4.103 milliseconds (cumulative count 99930) 248s 100.000% <= 5.103 milliseconds (cumulative count 100000) 248s 248s Summary: 248s throughput summary: 173010.38 requests per second 248s latency summary (msec): 248s avg min p50 p95 p99 max 248s 2.741 0.520 2.791 3.183 3.431 4.519 248s XADD: rps=258725.1 (overall: 284824.6) avg_msec=1.615 (overall: 1.615) ====== XADD ====== 248s 100000 requests 
completed in 0.35 seconds 248s 50 parallel clients 248s 3 bytes payload 248s keep alive: 1 248s host configuration "save": 3600 1 300 100 60 10000 248s host configuration "appendonly": no 248s multi-thread: no 248s 248s Latency by percentile distribution: 248s 0.000% <= 0.535 milliseconds (cumulative count 10) 248s 50.000% <= 1.623 milliseconds (cumulative count 50730) 248s 75.000% <= 1.767 milliseconds (cumulative count 75660) 248s 87.500% <= 1.863 milliseconds (cumulative count 88210) 248s 93.750% <= 1.927 milliseconds (cumulative count 93840) 248s 96.875% <= 1.991 milliseconds (cumulative count 97110) 248s 98.438% <= 2.047 milliseconds (cumulative count 98540) 248s 99.219% <= 2.103 milliseconds (cumulative count 99250) 248s 99.609% <= 2.159 milliseconds (cumulative count 99650) 248s 99.805% <= 2.207 milliseconds (cumulative count 99830) 248s 99.902% <= 2.239 milliseconds (cumulative count 99910) 248s 99.951% <= 2.271 milliseconds (cumulative count 99960) 248s 99.976% <= 2.287 milliseconds (cumulative count 99980) 248s 99.988% <= 2.303 milliseconds (cumulative count 99990) 248s 99.994% <= 2.311 milliseconds (cumulative count 100000) 248s 100.000% <= 2.311 milliseconds (cumulative count 100000) 248s 248s Cumulative distribution of latencies: 248s 0.000% <= 0.103 milliseconds (cumulative count 0) 248s 0.020% <= 0.607 milliseconds (cumulative count 20) 248s 0.050% <= 0.703 milliseconds (cumulative count 50) 248s 0.130% <= 0.807 milliseconds (cumulative count 130) 248s 0.200% <= 0.903 milliseconds (cumulative count 200) 248s 0.970% <= 1.007 milliseconds (cumulative count 970) 248s 3.890% <= 1.103 milliseconds (cumulative count 3890) 248s 7.210% <= 1.207 milliseconds (cumulative count 7210) 248s 10.510% <= 1.303 milliseconds (cumulative count 10510) 248s 18.380% <= 1.407 milliseconds (cumulative count 18380) 248s 30.210% <= 1.503 milliseconds (cumulative count 30210) 248s 47.800% <= 1.607 milliseconds (cumulative count 47800) 248s 65.210% <= 1.703 milliseconds 
(cumulative count 65210) 248s 81.450% <= 1.807 milliseconds (cumulative count 81450) 248s 92.020% <= 1.903 milliseconds (cumulative count 92020) 248s 97.670% <= 2.007 milliseconds (cumulative count 97670) 248s 99.250% <= 2.103 milliseconds (cumulative count 99250) 248s 100.000% <= 3.103 milliseconds (cumulative count 100000) 248s 248s Summary: 248s throughput summary: 287356.34 requests per second 248s latency summary (msec): 248s avg min p50 p95 p99 max 248s 1.601 0.528 1.623 1.951 2.079 2.311 248s 249s autopkgtest [23:54:03]: test 0002-benchmark: -----------------------] 252s 0002-benchmark PASS 252s autopkgtest [23:54:06]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 256s autopkgtest [23:54:10]: test 0003-valkey-check-aof: preparing testbed 258s Reading package lists... 258s Building dependency tree... 258s Reading state information... 258s Starting pkgProblemResolver with broken count: 0 258s Starting 2 pkgProblemResolver with broken count: 0 258s Done 259s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 266s autopkgtest [23:54:20]: test 0003-valkey-check-aof: [----------------------- 268s autopkgtest [23:54:22]: test 0003-valkey-check-aof: -----------------------] 272s autopkgtest [23:54:26]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - - 272s 0003-valkey-check-aof PASS 275s autopkgtest [23:54:29]: test 0004-valkey-check-rdb: preparing testbed 277s Reading package lists... 277s Building dependency tree... 277s Reading state information... 278s Starting pkgProblemResolver with broken count: 0 278s Starting 2 pkgProblemResolver with broken count: 0 278s Done 279s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
286s autopkgtest [23:54:40]: test 0004-valkey-check-rdb: [-----------------------
293s OK
293s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb
293s [offset 27] AUX FIELD valkey-ver = '7.2.8'
293s [offset 41] AUX FIELD redis-bits = '32'
293s [offset 53] AUX FIELD ctime = '1751586887'
293s [offset 68] AUX FIELD used-mem = '2825088'
293s [offset 80] AUX FIELD aof-base = '0'
293s [offset 82] Selecting DB ID 0
293s [offset 566317] Checksum OK
293s [offset 566317] \o/ RDB looks OK! \o/
293s [info] 5 keys read
293s [info] 0 expires
293s [info] 0 already expired
293s autopkgtest [23:54:47]: test 0004-valkey-check-rdb: -----------------------]
297s autopkgtest [23:54:51]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - -
297s 0004-valkey-check-rdb PASS
300s autopkgtest [23:54:54]: test 0005-cjson: preparing testbed
302s Reading package lists...
302s Building dependency tree...
302s Reading state information...
303s Starting pkgProblemResolver with broken count: 0
303s Starting 2 pkgProblemResolver with broken count: 0
303s Done
304s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
311s autopkgtest [23:55:05]: test 0005-cjson: [-----------------------
318s
318s autopkgtest [23:55:12]: test 0005-cjson: -----------------------]
322s autopkgtest [23:55:16]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
322s 0005-cjson PASS
326s autopkgtest [23:55:20]: test 0006-migrate-from-redis: preparing testbed
352s autopkgtest [23:55:46]: testbed dpkg architecture: armhf
353s autopkgtest [23:55:47]: testbed apt version: 2.8.3
357s autopkgtest [23:55:51]: @@@@@@@@@@@@@@@@@@@@ test bed setup
359s autopkgtest [23:55:53]: testbed release detected to be: noble
366s autopkgtest [23:56:00]: updating testbed package index (apt update)
367s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB]
368s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease
368s Get:3 http://ftpmaster.internal/ubuntu noble-updates InRelease [126 kB]
368s Get:4 http://ftpmaster.internal/ubuntu noble-security InRelease [126 kB]
368s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [64.5 kB]
368s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [3948 B]
368s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [28.9 kB]
368s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [63.8 kB]
368s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main armhf Packages [98.9 kB]
369s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main armhf c-n-f Metadata [2252 B]
369s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/restricted armhf Packages [2720 B]
369s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted armhf c-n-f Metadata [116 B]
369s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/universe armhf Packages [276 kB]
369s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/universe armhf c-n-f Metadata [2608 B]
369s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/multiverse armhf Packages [752 B]
369s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/multiverse armhf c-n-f Metadata [116 B]
369s Get:17 http://ftpmaster.internal/ubuntu noble-updates/multiverse Sources [16.0 kB]
369s Get:18 http://ftpmaster.internal/ubuntu noble-updates/main Sources [429 kB]
369s Get:19 http://ftpmaster.internal/ubuntu noble-updates/restricted Sources [44.7 kB]
369s Get:20 http://ftpmaster.internal/ubuntu noble-updates/universe Sources [441 kB]
369s Get:21 http://ftpmaster.internal/ubuntu noble-updates/main armhf Packages [606 kB]
370s Get:22 http://ftpmaster.internal/ubuntu noble-updates/universe armhf Packages [861 kB]
370s Get:23 http://ftpmaster.internal/ubuntu noble-updates/multiverse armhf Packages [2964 B]
370s Get:24 http://ftpmaster.internal/ubuntu noble-security/multiverse Sources [10.2 kB]
370s Get:25 http://ftpmaster.internal/ubuntu noble-security/universe Sources [314 kB]
370s Get:26 http://ftpmaster.internal/ubuntu noble-security/restricted Sources [41.5 kB]
370s Get:27 http://ftpmaster.internal/ubuntu noble-security/main Sources [189 kB]
370s Get:28 http://ftpmaster.internal/ubuntu noble-security/main armhf Packages [374 kB]
370s Get:29 http://ftpmaster.internal/ubuntu noble-security/universe armhf Packages [641 kB]
370s Get:30 http://ftpmaster.internal/ubuntu noble-security/multiverse armhf Packages [2228 B]
372s Fetched 5035 kB in 3s (1585 kB/s)
374s Reading package lists...
379s autopkgtest [23:56:13]: upgrading testbed (apt dist-upgrade and autopurge)
381s Reading package lists...
381s Building dependency tree...
381s Reading state information...
381s Calculating upgrade...
381s Starting pkgProblemResolver with broken count: 0
381s Starting 2 pkgProblemResolver with broken count: 0
381s Done
383s Entering ResolveByKeep
383s
383s The following packages were automatically installed and are no longer required:
383s   linux-headers-6.8.0-62 linux-headers-6.8.0-62-generic
383s Use 'apt autoremove' to remove them.
383s The following NEW packages will be installed:
383s   linux-headers-6.8.0-63 linux-headers-6.8.0-63-generic
383s The following packages will be upgraded:
383s   fwupd gzip libfwupd2 libnetplan1 libnss-systemd libpam-systemd
383s   libsystemd-shared libsystemd0 libudev1 linux-headers-generic
383s   netplan-generator netplan.io openssh-client openssh-server
383s   openssh-sftp-server python3-netplan sudo systemd systemd-dev
383s   systemd-resolved systemd-sysv systemd-timesyncd udev
383s 23 upgraded, 2 newly installed, 0 to remove and 0 not upgraded.
383s Need to get 31.5 MB of archives.
383s After this operation, 92.6 MB of additional disk space will be used.
383s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main armhf gzip armhf 1.12-1ubuntu3.1 [96.0 kB]
383s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libnss-systemd armhf 255.4-1ubuntu8.10 [148 kB]
384s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-dev all 255.4-1ubuntu8.10 [105 kB]
384s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-timesyncd armhf 255.4-1ubuntu8.10 [36.0 kB]
384s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-resolved armhf 255.4-1ubuntu8.10 [289 kB]
384s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libsystemd-shared armhf 255.4-1ubuntu8.10 [2013 kB]
384s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libsystemd0 armhf 255.4-1ubuntu8.10 [408 kB]
384s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd-sysv armhf 255.4-1ubuntu8.10 [11.9 kB]
384s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libpam-systemd armhf 255.4-1ubuntu8.10 [216 kB]
384s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main armhf systemd armhf 255.4-1ubuntu8.10 [3506 kB]
385s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main armhf udev armhf 255.4-1ubuntu8.10 [1852 kB]
385s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libudev1 armhf 255.4-1ubuntu8.10 [168 kB]
385s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-sftp-server armhf 1:9.6p1-3ubuntu13.13 [35.5 kB]
385s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-server armhf 1:9.6p1-3ubuntu13.13 [505 kB]
385s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/main armhf openssh-client armhf 1:9.6p1-3ubuntu13.13 [891 kB]
386s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/main armhf python3-netplan armhf 1.1.2-2~ubuntu24.04.2 [24.1 kB]
386s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/main armhf netplan-generator armhf 1.1.2-2~ubuntu24.04.2 [60.7 kB]
386s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/main armhf netplan.io armhf 1.1.2-2~ubuntu24.04.2 [68.7 kB]
386s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/main armhf libnetplan1 armhf 1.1.2-2~ubuntu24.04.2 [123 kB]
386s Get:20 http://ftpmaster.internal/ubuntu noble-updates/main armhf sudo armhf 1.9.15p5-3ubuntu5.24.04.1 [937 kB]
386s Get:21 http://ftpmaster.internal/ubuntu noble-updates/main armhf libfwupd2 armhf 1.9.30-0ubuntu1~24.04.1 [126 kB]
386s Get:22 http://ftpmaster.internal/ubuntu noble-updates/main armhf fwupd armhf 1.9.30-0ubuntu1~24.04.1 [4410 kB]
386s Get:23 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-6.8.0-63 all 6.8.0-63.66 [13.9 MB]
387s Get:24 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-6.8.0-63-generic armhf 6.8.0-63.66 [1570 kB]
387s Get:25 http://ftpmaster.internal/ubuntu noble-updates/main armhf linux-headers-generic armhf 6.8.0-63.66 [10.5 kB]
387s Preconfiguring packages ...
387s Fetched 31.5 MB in 4s (8293 kB/s)
387s (Reading database ... 58042 files and directories currently installed.)
387s Preparing to unpack .../gzip_1.12-1ubuntu3.1_armhf.deb ...
388s Unpacking gzip (1.12-1ubuntu3.1) over (1.12-1ubuntu3) ...
388s Setting up gzip (1.12-1ubuntu3.1) ...
388s (Reading database ... 58042 files and directories currently installed.)
388s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking libnss-systemd:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.10_all.deb ...
388s Unpacking systemd-dev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking systemd-timesyncd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking systemd-resolved (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking libsystemd-shared:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking libsystemd0:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Setting up libsystemd0:armhf (255.4-1ubuntu8.10) ...
388s (Reading database ... 58042 files and directories currently installed.)
388s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking systemd-sysv (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking libpam-systemd:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../systemd_255.4-1ubuntu8.10_armhf.deb ...
388s Unpacking systemd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
388s Preparing to unpack .../udev_255.4-1ubuntu8.10_armhf.deb ...
389s Unpacking udev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
389s Preparing to unpack .../libudev1_255.4-1ubuntu8.10_armhf.deb ...
389s Unpacking libudev1:armhf (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
389s Setting up libudev1:armhf (255.4-1ubuntu8.10) ...
389s (Reading database ... 58042 files and directories currently installed.)
389s Preparing to unpack .../00-openssh-sftp-server_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
389s Unpacking openssh-sftp-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
389s Preparing to unpack .../01-openssh-server_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
389s Unpacking openssh-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
389s Preparing to unpack .../02-openssh-client_1%3a9.6p1-3ubuntu13.13_armhf.deb ...
389s Unpacking openssh-client (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
389s Preparing to unpack .../03-python3-netplan_1.1.2-2~ubuntu24.04.2_armhf.deb ...
389s Unpacking python3-netplan (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
389s Preparing to unpack .../04-netplan-generator_1.1.2-2~ubuntu24.04.2_armhf.deb ...
389s Adding 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
389s Unpacking netplan-generator (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
389s Preparing to unpack .../05-netplan.io_1.1.2-2~ubuntu24.04.2_armhf.deb ...
389s Unpacking netplan.io (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
389s Preparing to unpack .../06-libnetplan1_1.1.2-2~ubuntu24.04.2_armhf.deb ...
389s Unpacking libnetplan1:armhf (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
389s Preparing to unpack .../07-sudo_1.9.15p5-3ubuntu5.24.04.1_armhf.deb ...
389s Unpacking sudo (1.9.15p5-3ubuntu5.24.04.1) over (1.9.15p5-3ubuntu5) ...
389s Preparing to unpack .../08-libfwupd2_1.9.30-0ubuntu1~24.04.1_armhf.deb ...
389s Unpacking libfwupd2:armhf (1.9.30-0ubuntu1~24.04.1) over (1.9.29-0ubuntu1~24.04.1ubuntu1) ...
389s Preparing to unpack .../09-fwupd_1.9.30-0ubuntu1~24.04.1_armhf.deb ...
390s Unpacking fwupd (1.9.30-0ubuntu1~24.04.1) over (1.9.29-0ubuntu1~24.04.1ubuntu1) ...
390s Selecting previously unselected package linux-headers-6.8.0-63.
390s Preparing to unpack .../10-linux-headers-6.8.0-63_6.8.0-63.66_all.deb ...
390s Unpacking linux-headers-6.8.0-63 (6.8.0-63.66) ...
393s Selecting previously unselected package linux-headers-6.8.0-63-generic.
393s Preparing to unpack .../11-linux-headers-6.8.0-63-generic_6.8.0-63.66_armhf.deb ...
393s Unpacking linux-headers-6.8.0-63-generic (6.8.0-63.66) ...
394s Preparing to unpack .../12-linux-headers-generic_6.8.0-63.66_armhf.deb ...
394s Unpacking linux-headers-generic (6.8.0-63.66) over (6.8.0-62.65) ...
394s Setting up linux-headers-6.8.0-63 (6.8.0-63.66) ...
394s Setting up openssh-client (1:9.6p1-3ubuntu13.13) ...
394s Setting up libfwupd2:armhf (1.9.30-0ubuntu1~24.04.1) ...
394s Setting up systemd-dev (255.4-1ubuntu8.10) ...
394s Setting up libnetplan1:armhf (1.1.2-2~ubuntu24.04.2) ...
394s Setting up libsystemd-shared:armhf (255.4-1ubuntu8.10) ...
394s Setting up sudo (1.9.15p5-3ubuntu5.24.04.1) ...
394s Setting up linux-headers-6.8.0-63-generic (6.8.0-63.66) ...
394s Setting up python3-netplan (1.1.2-2~ubuntu24.04.2) ...
394s Setting up openssh-sftp-server (1:9.6p1-3ubuntu13.13) ...
394s Setting up openssh-server (1:9.6p1-3ubuntu13.13) ...
396s Setting up systemd (255.4-1ubuntu8.10) ...
396s Setting up linux-headers-generic (6.8.0-63.66) ...
396s Setting up systemd-timesyncd (255.4-1ubuntu8.10) ...
396s Setting up udev (255.4-1ubuntu8.10) ...
397s Setting up netplan-generator (1.1.2-2~ubuntu24.04.2) ...
397s Removing 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
397s Setting up fwupd (1.9.30-0ubuntu1~24.04.1) ...
398s fwupd-offline-update.service is a disabled or a static unit not running, not starting it.
398s fwupd-refresh.service is a disabled or a static unit not running, not starting it.
398s fwupd.service is a disabled or a static unit not running, not starting it.
398s Setting up systemd-resolved (255.4-1ubuntu8.10) ...
398s Setting up systemd-sysv (255.4-1ubuntu8.10) ...
398s Setting up libnss-systemd:armhf (255.4-1ubuntu8.10) ...
398s Setting up netplan.io (1.1.2-2~ubuntu24.04.2) ...
398s Setting up libpam-systemd:armhf (255.4-1ubuntu8.10) ...
399s Processing triggers for initramfs-tools (0.142ubuntu25.5) ...
399s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
399s Processing triggers for ufw (0.36.2-6) ...
399s Processing triggers for man-db (2.12.0-4build2) ...
400s Processing triggers for dbus (1.14.10-4ubuntu4.1) ...
400s Processing triggers for install-info (7.1-3build2) ...
402s Reading package lists...
403s Building dependency tree...
403s Reading state information...
403s Starting pkgProblemResolver with broken count: 0
403s Starting 2 pkgProblemResolver with broken count: 0
403s Done
404s The following packages will be REMOVED:
404s   linux-headers-6.8.0-62* linux-headers-6.8.0-62-generic*
404s 0 upgraded, 0 newly installed, 2 to remove and 0 not upgraded.
404s After this operation, 92.5 MB disk space will be freed.
404s (Reading database ... 89198 files and directories currently installed.)
404s Removing linux-headers-6.8.0-62-generic (6.8.0-62.65) ...
404s Removing linux-headers-6.8.0-62 (6.8.0-62.65) ...
407s autopkgtest [23:56:41]: rebooting testbed after setup commands that affected boot
468s Reading package lists...
468s Building dependency tree...
468s Reading state information...
468s Starting pkgProblemResolver with broken count: 0
468s Starting 2 pkgProblemResolver with broken count: 0
468s Done
469s The following NEW packages will be installed:
469s   libatomic1 libjemalloc2 liblzf1 redis-sentinel redis-server redis-tools
469s 0 upgraded, 6 newly installed, 0 to remove and 0 not upgraded.
469s Need to get 1225 kB of archives.
469s After this operation, 4616 kB of additional disk space will be used.
469s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main armhf libatomic1 armhf 14.2.0-4ubuntu2~24.04 [7888 B]
469s Get:2 http://ftpmaster.internal/ubuntu noble/universe armhf libjemalloc2 armhf 5.3.0-2build1 [200 kB]
470s Get:3 http://ftpmaster.internal/ubuntu noble/universe armhf liblzf1 armhf 3.6-4 [6554 B]
470s Get:4 http://ftpmaster.internal/ubuntu noble-updates/universe armhf redis-tools armhf 5:7.0.15-1ubuntu0.24.04.1 [946 kB]
470s Get:5 http://ftpmaster.internal/ubuntu noble-updates/universe armhf redis-sentinel armhf 5:7.0.15-1ubuntu0.24.04.1 [12.3 kB]
470s Get:6 http://ftpmaster.internal/ubuntu noble-updates/universe armhf redis-server armhf 5:7.0.15-1ubuntu0.24.04.1 [51.7 kB]
470s Fetched 1225 kB in 1s (1679 kB/s)
470s Selecting previously unselected package libatomic1:armhf.
470s (Reading database ... 58042 files and directories currently installed.)
470s Preparing to unpack .../0-libatomic1_14.2.0-4ubuntu2~24.04_armhf.deb ...
470s Unpacking libatomic1:armhf (14.2.0-4ubuntu2~24.04) ...
470s Selecting previously unselected package libjemalloc2:armhf.
470s Preparing to unpack .../1-libjemalloc2_5.3.0-2build1_armhf.deb ...
470s Unpacking libjemalloc2:armhf (5.3.0-2build1) ...
470s Selecting previously unselected package liblzf1:armhf.
470s Preparing to unpack .../2-liblzf1_3.6-4_armhf.deb ...
470s Unpacking liblzf1:armhf (3.6-4) ...
470s Selecting previously unselected package redis-tools.
470s Preparing to unpack .../3-redis-tools_5%3a7.0.15-1ubuntu0.24.04.1_armhf.deb ...
470s Unpacking redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
470s Selecting previously unselected package redis-sentinel.
470s Preparing to unpack .../4-redis-sentinel_5%3a7.0.15-1ubuntu0.24.04.1_armhf.deb ...
470s Unpacking redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
470s Selecting previously unselected package redis-server.
470s Preparing to unpack .../5-redis-server_5%3a7.0.15-1ubuntu0.24.04.1_armhf.deb ...
471s Unpacking redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
471s Setting up libjemalloc2:armhf (5.3.0-2build1) ...
471s Setting up liblzf1:armhf (3.6-4) ...
471s Setting up libatomic1:armhf (14.2.0-4ubuntu2~24.04) ...
471s Setting up redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
471s Setting up redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
471s Created symlink /etc/systemd/system/redis.service → /usr/lib/systemd/system/redis-server.service.
471s Created symlink /etc/systemd/system/multi-user.target.wants/redis-server.service → /usr/lib/systemd/system/redis-server.service.
471s Setting up redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
472s Created symlink /etc/systemd/system/sentinel.service → /usr/lib/systemd/system/redis-sentinel.service.
472s Created symlink /etc/systemd/system/multi-user.target.wants/redis-sentinel.service → /usr/lib/systemd/system/redis-sentinel.service.
472s Processing triggers for man-db (2.12.0-4build2) ...
473s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
484s autopkgtest [23:57:58]: test 0006-migrate-from-redis: [-----------------------
486s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION
486s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf
486s + systemctl restart redis-server
486s + redis-cli -h 127.0.0.1 -p 6379 SET test 1
486s + redis-cli -h 127.0.0.1 -p 6379 GET test
486s OK
486s 1
486s + redis-cli -h 127.0.0.1 -p 6379 SAVE
486s + sha256sum /var/lib/redis/dump.rdb
486s OK
486s 61b8d7db55347a3b259696fd1fd46e66e0134833f1f87a74151b2cc6a1cafc47  /var/lib/redis/dump.rdb
486s + apt-get install -y valkey-redis-compat
486s Reading package lists...
487s Building dependency tree...
487s Reading state information...
487s The following additional packages will be installed:
487s   valkey-server valkey-tools
487s Suggested packages:
487s   ruby-redis
487s The following packages will be REMOVED:
487s   redis-sentinel redis-server redis-tools
487s The following NEW packages will be installed:
487s   valkey-redis-compat valkey-server valkey-tools
487s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded.
487s Need to get 1173 kB of archives.
487s After this operation, 457 kB of additional disk space will be used.
487s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe armhf valkey-tools armhf 7.2.8+dfsg1-0ubuntu0.24.04.2 [1116 kB]
488s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe armhf valkey-server armhf 7.2.8+dfsg1-0ubuntu0.24.04.2 [49.3 kB]
488s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe armhf valkey-redis-compat all 7.2.8+dfsg1-0ubuntu0.24.04.2 [7748 B]
488s Fetched 1173 kB in 1s (1785 kB/s)
489s (Reading database ... 58101 files and directories currently installed.)
489s Removing redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
489s Removing redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
490s Removing redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
490s Selecting previously unselected package valkey-tools.
490s (Reading database ... 58064 files and directories currently installed.)
490s Preparing to unpack .../valkey-tools_7.2.8+dfsg1-0ubuntu0.24.04.2_armhf.deb ...
490s Unpacking valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
490s Selecting previously unselected package valkey-server.
490s Preparing to unpack .../valkey-server_7.2.8+dfsg1-0ubuntu0.24.04.2_armhf.deb ...
490s Unpacking valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
490s Selecting previously unselected package valkey-redis-compat.
490s Preparing to unpack .../valkey-redis-compat_7.2.8+dfsg1-0ubuntu0.24.04.2_all.deb ...
490s Unpacking valkey-redis-compat (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
490s Setting up valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
490s Setting up valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
491s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service.
491s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service.
491s Setting up valkey-redis-compat (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
491s dpkg-query: no packages found matching valkey-sentinel
491s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf.
491s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf.
491s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey.
491s Processing triggers for man-db (2.12.0-4build2) ...
492s + '[' -f /etc/valkey/REDIS_MIGRATION ']'
492s + sha256sum /var/lib/valkey/dump.rdb
492s 40e1daad1ff5b64144848f310e91b2119f9c8f7f3147f1cd1cecd4d34946d5e0  /var/lib/valkey/dump.rdb
492s + systemctl status valkey-server
492s + grep inactive
492s      Active: inactive (dead) since Thu 2025-07-03 23:58:05 UTC; 634ms ago
492s + rm /etc/valkey/REDIS_MIGRATION
492s + systemctl start valkey-server
492s + systemctl status valkey-server
492s + grep running
492s      Active: active (running) since Thu 2025-07-03 23:58:06 UTC; 10ms ago
492s + sha256sum /var/lib/valkey/dump.rdb
492s 40e1daad1ff5b64144848f310e91b2119f9c8f7f3147f1cd1cecd4d34946d5e0  /var/lib/valkey/dump.rdb
492s + cat /etc/valkey/valkey.conf
492s + grep loglevel
492s + grep debug
492s loglevel debug
492s + valkey-cli -h 127.0.0.1 -p 6379 GET test
492s + grep 1
492s 1
492s autopkgtest [23:58:06]: test 0006-migrate-from-redis: -----------------------]
496s 0006-migrate-from-redis PASS
496s autopkgtest [23:58:10]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - -
499s autopkgtest [23:58:13]: @@@@@@@@@@@@@@@@@@@@ summary
499s 0001-valkey-cli PASS
499s 0002-benchmark PASS
499s 0003-valkey-check-aof PASS
499s 0004-valkey-check-rdb PASS
499s 0005-cjson PASS
499s 0006-migrate-from-redis PASS