0s autopkgtest [02:11:48]: starting date and time: 2025-07-04 02:11:48+0000
0s autopkgtest [02:11:48]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [02:11:48]: host juju-7f2275-prod-proposed-migration-environment-21; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.gi7tmcmh/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:systemd,src:netplan.io,src:openssh,src:samba --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 '--env=ADT_TEST_TRIGGERS=systemd/255.4-1ubuntu8.10 netplan.io/1.1.2-2~ubuntu24.04.2 openssh/1:9.6p1-3ubuntu13.13 samba/2:4.19.5+dfsg-4ubuntu9.2' -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor builder-cpu2-ram4-disk20 --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-21@bos03-5.secgroup --name adt-noble-amd64-valkey-20250704-021147-juju-7f2275-prod-proposed-migration-environment-21-30308ce2-47c3-4d53-b097-f9239f946bc0 --image adt/ubuntu-noble-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-21 --net-id=net_prod-proposed-migration-amd64 -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
3s Creating nova instance adt-noble-amd64-valkey-20250704-021147-juju-7f2275-prod-proposed-migration-environment-21-30308ce2-47c3-4d53-b097-f9239f946bc0 from image adt/ubuntu-noble-amd64-server-20250703.img (UUID 841a84e8-df42-4fef-9073-9d50a10876b1)...
129s autopkgtest [02:13:57]: testbed dpkg architecture: amd64
129s autopkgtest [02:13:57]: testbed apt version: 2.8.3
129s autopkgtest [02:13:57]: @@@@@@@@@@@@@@@@@@@@ test bed setup
129s autopkgtest [02:13:57]: testbed release detected to be: None
130s autopkgtest [02:13:58]: updating testbed package index (apt update)
130s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB]
131s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease
131s Hit:3 http://ftpmaster.internal/ubuntu noble-updates InRelease
131s Hit:4 http://ftpmaster.internal/ubuntu noble-security InRelease
131s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [65.3 kB]
131s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [63.8 kB]
131s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [3948 B]
131s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [28.9 kB]
131s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main i386 Packages [59.3 kB]
131s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 Packages [283 kB]
131s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 c-n-f Metadata [2248 B]
131s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted amd64 Packages [423 kB]
131s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/restricted i386 Packages [9812 B]
131s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/restricted amd64 c-n-f Metadata [116 B]
131s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/universe amd64 Packages [452 kB]
131s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/universe i386 Packages [348 kB]
131s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/universe amd64 c-n-f Metadata [7448 B]
131s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/multiverse i386 Packages [752 B]
131s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/multiverse amd64 Packages [5264 B]
131s Get:20 http://ftpmaster.internal/ubuntu noble-proposed/multiverse amd64 c-n-f Metadata [116 B]
135s Fetched 2018 kB in 1s (1847 kB/s)
136s Reading package lists...
136s autopkgtest [02:14:04]: upgrading testbed (apt dist-upgrade and autopurge)
137s Reading package lists...
137s Building dependency tree...
137s Reading state information...
137s Calculating upgrade...
137s Starting pkgProblemResolver with broken count: 0
137s Starting 2 pkgProblemResolver with broken count: 0
137s Done
138s Entering ResolveByKeep
138s The following packages will be upgraded:
138s   gzip libnetplan1 libnss-systemd libpam-systemd libsystemd-shared libsystemd0
138s   libudev1 netplan-generator netplan.io openssh-client openssh-server
138s   openssh-sftp-server python3-netplan systemd systemd-dev systemd-resolved
138s   systemd-sysv systemd-timesyncd udev
138s 19 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
138s Need to get 10.7 MB of archives.
138s After this operation, 34.8 kB of additional disk space will be used.
138s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main amd64 gzip amd64 1.12-1ubuntu3.1 [99.0 kB]
139s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libnss-systemd amd64 255.4-1ubuntu8.10 [159 kB]
139s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-dev all 255.4-1ubuntu8.10 [105 kB]
139s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-timesyncd amd64 255.4-1ubuntu8.10 [35.3 kB]
139s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-resolved amd64 255.4-1ubuntu8.10 [296 kB]
139s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libsystemd-shared amd64 255.4-1ubuntu8.10 [2074 kB]
139s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libsystemd0 amd64 255.4-1ubuntu8.10 [434 kB]
139s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-sysv amd64 255.4-1ubuntu8.10 [11.9 kB]
139s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libpam-systemd amd64 255.4-1ubuntu8.10 [235 kB]
139s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd amd64 255.4-1ubuntu8.10 [3475 kB]
140s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 udev amd64 255.4-1ubuntu8.10 [1873 kB]
140s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libudev1 amd64 255.4-1ubuntu8.10 [176 kB]
140s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-sftp-server amd64 1:9.6p1-3ubuntu13.13 [37.1 kB]
140s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-server amd64 1:9.6p1-3ubuntu13.13 [510 kB]
140s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-client amd64 1:9.6p1-3ubuntu13.13 [906 kB]
140s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 python3-netplan amd64 1.1.2-2~ubuntu24.04.2 [24.3 kB]
140s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 netplan-generator amd64 1.1.2-2~ubuntu24.04.2 [61.1 kB]
140s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 netplan.io amd64 1.1.2-2~ubuntu24.04.2 [69.7 kB]
140s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libnetplan1 amd64 1.1.2-2~ubuntu24.04.2 [132 kB]
140s Preconfiguring packages ...
140s Fetched 10.7 MB in 2s (5897 kB/s)
140s (Reading database ... 106307 files and directories currently installed.)
140s Preparing to unpack .../gzip_1.12-1ubuntu3.1_amd64.deb ...
140s Unpacking gzip (1.12-1ubuntu3.1) over (1.12-1ubuntu3) ...
141s Setting up gzip (1.12-1ubuntu3.1) ...
141s (Reading database ... 106307 files and directories currently installed.)
141s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking libnss-systemd:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.10_all.deb ...
141s Unpacking systemd-dev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking systemd-timesyncd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking systemd-resolved (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking libsystemd-shared:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking libsystemd0:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Setting up libsystemd0:amd64 (255.4-1ubuntu8.10) ...
141s (Reading database ... 106307 files and directories currently installed.)
141s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking systemd-sysv (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking libpam-systemd:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../systemd_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking systemd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../udev_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking udev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Preparing to unpack .../libudev1_255.4-1ubuntu8.10_amd64.deb ...
141s Unpacking libudev1:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
141s Setting up libudev1:amd64 (255.4-1ubuntu8.10) ...
142s (Reading database ... 106307 files and directories currently installed.)
142s Preparing to unpack .../0-openssh-sftp-server_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
142s Unpacking openssh-sftp-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
142s Preparing to unpack .../1-openssh-server_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
142s Unpacking openssh-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
142s Preparing to unpack .../2-openssh-client_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
142s Unpacking openssh-client (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
142s Preparing to unpack .../3-python3-netplan_1.1.2-2~ubuntu24.04.2_amd64.deb ...
142s Unpacking python3-netplan (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
142s Preparing to unpack .../4-netplan-generator_1.1.2-2~ubuntu24.04.2_amd64.deb ...
142s Adding 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
142s Unpacking netplan-generator (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
142s Preparing to unpack .../5-netplan.io_1.1.2-2~ubuntu24.04.2_amd64.deb ...
142s Unpacking netplan.io (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
142s Preparing to unpack .../6-libnetplan1_1.1.2-2~ubuntu24.04.2_amd64.deb ...
142s Unpacking libnetplan1:amd64 (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
142s Setting up openssh-client (1:9.6p1-3ubuntu13.13) ...
142s Setting up systemd-dev (255.4-1ubuntu8.10) ...
142s Setting up libnetplan1:amd64 (1.1.2-2~ubuntu24.04.2) ...
142s Setting up libsystemd-shared:amd64 (255.4-1ubuntu8.10) ...
142s Setting up python3-netplan (1.1.2-2~ubuntu24.04.2) ...
142s Setting up openssh-sftp-server (1:9.6p1-3ubuntu13.13) ...
142s Setting up openssh-server (1:9.6p1-3ubuntu13.13) ...
143s Setting up systemd (255.4-1ubuntu8.10) ...
144s Setting up systemd-timesyncd (255.4-1ubuntu8.10) ...
144s Setting up udev (255.4-1ubuntu8.10) ...
145s Setting up netplan-generator (1.1.2-2~ubuntu24.04.2) ...
145s Removing 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
145s Setting up systemd-resolved (255.4-1ubuntu8.10) ...
145s Setting up systemd-sysv (255.4-1ubuntu8.10) ...
145s Setting up libnss-systemd:amd64 (255.4-1ubuntu8.10) ...
145s Setting up netplan.io (1.1.2-2~ubuntu24.04.2) ...
145s Setting up libpam-systemd:amd64 (255.4-1ubuntu8.10) ...
145s Processing triggers for initramfs-tools (0.142ubuntu25.5) ...
145s update-initramfs: Generating /boot/initrd.img-6.8.0-63-generic
152s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
152s Processing triggers for ufw (0.36.2-6) ...
152s Processing triggers for man-db (2.12.0-4build2) ...
153s Processing triggers for dbus (1.14.10-4ubuntu4.1) ...
153s Processing triggers for install-info (7.1-3build2) ...
153s Reading package lists...
154s Building dependency tree...
154s Reading state information...
154s Starting pkgProblemResolver with broken count: 0
154s Starting 2 pkgProblemResolver with broken count: 0
154s Done
154s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
155s autopkgtest [02:14:23]: rebooting testbed after setup commands that affected boot
174s autopkgtest [02:14:42]: testbed running kernel: Linux 6.8.0-63-generic #66-Ubuntu SMP PREEMPT_DYNAMIC Fri Jun 13 20:25:30 UTC 2025
176s autopkgtest [02:14:44]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
180s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (dsc) [2134 B]
180s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (tar) [2470 kB]
180s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.8+dfsg1-0ubuntu0.24.04.2 (diff) [18.2 kB]
180s gpgv: Signature made Mon Mar 3 15:59:36 2025 UTC
180s gpgv: using RSA key 38C77D33856973A58762FBFE401EFCBCDA0FF1BD
180s gpgv: Can't check signature: No public key
180s dpkg-source: warning: cannot verify inline signature for ./valkey_7.2.8+dfsg1-0ubuntu0.24.04.2.dsc: no acceptable signature found
180s autopkgtest [02:14:48]: testing package valkey version 7.2.8+dfsg1-0ubuntu0.24.04.2
182s autopkgtest [02:14:50]: build not needed
184s autopkgtest [02:14:52]: test 0001-valkey-cli: preparing testbed
184s Reading package lists...
184s Building dependency tree...
184s Reading state information...
184s Starting pkgProblemResolver with broken count: 0
184s Starting 2 pkgProblemResolver with broken count: 0
184s Done
185s The following NEW packages will be installed:
185s   libatomic1 libjemalloc2 liblzf1 valkey-server valkey-tools
185s 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
185s Need to get 1592 kB of archives.
185s After this operation, 7997 kB of additional disk space will be used.
185s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main amd64 libatomic1 amd64 14.2.0-4ubuntu2~24.04 [10.5 kB]
185s Get:2 http://ftpmaster.internal/ubuntu noble/universe amd64 libjemalloc2 amd64 5.3.0-2build1 [256 kB]
185s Get:3 http://ftpmaster.internal/ubuntu noble/universe amd64 liblzf1 amd64 3.6-4 [7624 B]
185s Get:4 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 valkey-tools amd64 7.2.8+dfsg1-0ubuntu0.24.04.2 [1269 kB]
185s Get:5 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 valkey-server amd64 7.2.8+dfsg1-0ubuntu0.24.04.2 [49.3 kB]
186s Fetched 1592 kB in 1s (2300 kB/s)
186s Selecting previously unselected package libatomic1:amd64.
186s (Reading database ... 106307 files and directories currently installed.)
186s Preparing to unpack .../libatomic1_14.2.0-4ubuntu2~24.04_amd64.deb ...
186s Unpacking libatomic1:amd64 (14.2.0-4ubuntu2~24.04) ...
186s Selecting previously unselected package libjemalloc2:amd64.
186s Preparing to unpack .../libjemalloc2_5.3.0-2build1_amd64.deb ...
186s Unpacking libjemalloc2:amd64 (5.3.0-2build1) ...
186s Selecting previously unselected package liblzf1:amd64.
186s Preparing to unpack .../liblzf1_3.6-4_amd64.deb ...
186s Unpacking liblzf1:amd64 (3.6-4) ...
186s Selecting previously unselected package valkey-tools.
186s Preparing to unpack .../valkey-tools_7.2.8+dfsg1-0ubuntu0.24.04.2_amd64.deb ...
186s Unpacking valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
186s Selecting previously unselected package valkey-server.
186s Preparing to unpack .../valkey-server_7.2.8+dfsg1-0ubuntu0.24.04.2_amd64.deb ...
186s Unpacking valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
186s Setting up libjemalloc2:amd64 (5.3.0-2build1) ...
186s Setting up liblzf1:amd64 (3.6-4) ...
186s Setting up libatomic1:amd64 (14.2.0-4ubuntu2~24.04) ...
186s Setting up valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
186s Setting up valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
187s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service.
187s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service.
187s Processing triggers for man-db (2.12.0-4build2) ...
187s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
189s autopkgtest [02:14:57]: test 0001-valkey-cli: [-----------------------
194s # Server
194s redis_version:7.2.4
194s server_name:valkey
194s valkey_version:7.2.8
194s redis_git_sha1:00000000
194s redis_git_dirty:0
194s redis_build_id:c153bd6b3f23fc46
194s redis_mode:standalone
194s os:Linux 6.8.0-63-generic x86_64
194s arch_bits:64
194s monotonic_clock:POSIX clock_gettime
194s multiplexing_api:epoll
194s atomicvar_api:c11-builtin
194s gcc_version:13.3.0
194s process_id:1508
194s process_supervised:systemd
194s run_id:0adaef4edc5f7ccec58b598aca6d02ea1ecc45e9
194s tcp_port:6379
194s server_time_usec:1751595302439297
194s uptime_in_seconds:5
194s uptime_in_days:0
194s hz:10
194s configured_hz:10
194s lru_clock:6764838
194s executable:/usr/bin/valkey-server
194s config_file:/etc/valkey/valkey.conf
194s io_threads_active:0
194s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
194s 
194s # Clients
194s connected_clients:1
194s cluster_connections:0
194s maxclients:10000
194s client_recent_max_input_buffer:0
194s client_recent_max_output_buffer:0
194s blocked_clients:0
194s tracking_clients:0
194s clients_in_timeout_table:0
194s total_blocking_keys:0
194s total_blocking_keys_on_nokey:0
194s 
194s # Memory
194s used_memory:938776
194s used_memory_human:916.77K
194s used_memory_rss:13893632
194s used_memory_rss_human:13.25M
194s used_memory_peak:938776
194s used_memory_peak_human:916.77K
194s used_memory_peak_perc:102.54%
194s used_memory_overhead:897808
194s used_memory_startup:897608
194s used_memory_dataset:40968
194s used_memory_dataset_perc:99.51%
194s allocator_allocated:1989856
194s allocator_active:2170880
194s allocator_resident:4935680
194s total_system_memory:4106231808
194s total_system_memory_human:3.82G
194s used_memory_lua:31744
194s used_memory_vm_eval:31744
194s used_memory_lua_human:31.00K
194s used_memory_scripts_eval:0
194s number_of_cached_scripts:0
194s number_of_functions:0
194s number_of_libraries:0
194s used_memory_vm_functions:32768
194s used_memory_vm_total:64512
194s used_memory_vm_total_human:63.00K
194s used_memory_functions:200
194s used_memory_scripts:200
194s used_memory_scripts_human:200B
194s maxmemory:0
194s maxmemory_human:0B
194s maxmemory_policy:noeviction
194s allocator_frag_ratio:1.09
194s allocator_frag_bytes:181024
194s allocator_rss_ratio:2.27
194s allocator_rss_bytes:2764800
194s rss_overhead_ratio:2.81
194s rss_overhead_bytes:8957952
194s mem_fragmentation_ratio:15.48
194s mem_fragmentation_bytes:12995880
194s mem_not_counted_for_evict:0
194s mem_replication_backlog:0
194s mem_total_replication_buffers:0
194s mem_clients_slaves:0
194s mem_clients_normal:0
194s mem_cluster_links:0
194s mem_aof_buffer:0
194s mem_allocator:jemalloc-5.3.0
194s active_defrag_running:0
194s lazyfree_pending_objects:0
194s lazyfreed_objects:0
194s 
194s # Persistence
194s loading:0
194s async_loading:0
194s current_cow_peak:0
194s current_cow_size:0
194s current_cow_size_age:0
194s current_fork_perc:0.00
194s current_save_keys_processed:0
194s current_save_keys_total:0
194s rdb_changes_since_last_save:0
194s rdb_bgsave_in_progress:0
194s rdb_last_save_time:1751595297
194s rdb_last_bgsave_status:ok
194s rdb_last_bgsave_time_sec:-1
194s rdb_current_bgsave_time_sec:-1
194s rdb_saves:0
194s rdb_last_cow_size:0
194s rdb_last_load_keys_expired:0
194s rdb_last_load_keys_loaded:0
194s aof_enabled:0
194s aof_rewrite_in_progress:0
194s aof_rewrite_scheduled:0
194s aof_last_rewrite_time_sec:-1
194s aof_current_rewrite_time_sec:-1
194s aof_last_bgrewrite_status:ok
194s aof_rewrites:0
194s aof_rewrites_consecutive_failures:0
194s aof_last_write_status:ok
194s aof_last_cow_size:0
194s module_fork_in_progress:0
194s module_fork_last_cow_size:0
194s 
194s # Stats
194s total_connections_received:1
194s total_commands_processed:0
194s instantaneous_ops_per_sec:0
194s total_net_input_bytes:14
194s total_net_output_bytes:0
194s total_net_repl_input_bytes:0
194s total_net_repl_output_bytes:0
194s instantaneous_input_kbps:0.00
194s instantaneous_output_kbps:0.00
194s instantaneous_input_repl_kbps:0.00
194s instantaneous_output_repl_kbps:0.00
194s rejected_connections:0
194s sync_full:0
194s sync_partial_ok:0
194s sync_partial_err:0
194s expired_keys:0
194s expired_stale_perc:0.00
194s expired_time_cap_reached_count:0
194s expire_cycle_cpu_milliseconds:0
194s evicted_keys:0
194s evicted_clients:0
194s total_eviction_exceeded_time:0
194s current_eviction_exceeded_time:0
194s keyspace_hits:0
194s keyspace_misses:0
194s pubsub_channels:0
194s pubsub_patterns:0
194s pubsubshard_channels:0
194s latest_fork_usec:0
194s total_forks:0
194s migrate_cached_sockets:0
194s slave_expires_tracked_keys:0
194s active_defrag_hits:0
194s active_defrag_misses:0
194s active_defrag_key_hits:0
194s active_defrag_key_misses:0
194s total_active_defrag_time:0
194s current_active_defrag_time:0
194s tracking_total_keys:0
194s tracking_total_items:0
194s tracking_total_prefixes:0
194s unexpected_error_replies:0
194s total_error_replies:0
194s dump_payload_sanitizations:0
194s total_reads_processed:1
194s total_writes_processed:0
194s io_threaded_reads_processed:0
194s io_threaded_writes_processed:0
194s reply_buffer_shrinks:0
194s reply_buffer_expands:0
194s eventloop_cycles:51
194s eventloop_duration_sum:5772
194s eventloop_duration_cmd_sum:0
194s instantaneous_eventloop_cycles_per_sec:9
194s instantaneous_eventloop_duration_usec:103
194s acl_access_denied_auth:0
194s acl_access_denied_cmd:0
194s acl_access_denied_key:0
194s acl_access_denied_channel:0
194s 
194s # Replication
194s role:master
194s connected_slaves:0
194s master_failover_state:no-failover
194s master_replid:968fa1d34ea5bff2a952eb4ae99c4c3d82da40a6
194s master_replid2:0000000000000000000000000000000000000000
194s master_repl_offset:0
194s second_repl_offset:-1
194s repl_backlog_active:0
194s repl_backlog_size:1048576
194s repl_backlog_first_byte_offset:0
194s repl_backlog_histlen:0
194s 
194s # CPU
194s used_cpu_sys:0.030641
194s used_cpu_user:0.028541
194s used_cpu_sys_children:0.000000
194s used_cpu_user_children:0.000625
194s used_cpu_sys_main_thread:0.029952
194s used_cpu_user_main_thread:0.028898
194s 
194s # Modules
194s 
194s # Errorstats
194s 
194s # Cluster
194s cluster_enabled:0
194s 
194s # Keyspace
194s Redis ver. 7.2.8
194s autopkgtest [02:15:02]: test 0001-valkey-cli: -----------------------]
195s autopkgtest [02:15:03]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
195s 0001-valkey-cli PASS
195s autopkgtest [02:15:03]: test 0002-benchmark: preparing testbed
195s Reading package lists...
195s Building dependency tree...
195s Reading state information...
196s Starting pkgProblemResolver with broken count: 0
196s Starting 2 pkgProblemResolver with broken count: 0
196s Done
196s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
197s autopkgtest [02:15:05]: test 0002-benchmark: [-----------------------
202s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=-nan (overall: -nan)
202s ====== PING_INLINE ======
202s 100000 requests completed in 0.17 seconds
202s 50 parallel clients
202s 3 bytes payload
202s keep alive: 1
202s host configuration "save": 3600 1 300 100 60 10000
202s host configuration "appendonly": no
202s multi-thread: no
202s 
202s Latency by percentile distribution:
202s 0.000% <= 0.151 milliseconds (cumulative count 10)
202s 50.000% <= 0.423 milliseconds (cumulative count 53410)
202s 75.000% <= 0.479 milliseconds (cumulative count 76500)
202s 87.500% <= 0.527 milliseconds (cumulative count 87780)
202s 93.750% <= 0.599 milliseconds (cumulative count 94110)
202s 96.875% <= 0.655 milliseconds (cumulative count 96960)
202s 98.438% <= 0.719 milliseconds (cumulative count 98500)
202s 99.219% <= 0.895 milliseconds (cumulative count 99220)
202s 99.609% <= 1.455 milliseconds (cumulative count 99610)
202s 99.805% <= 1.847 milliseconds (cumulative count 99810)
202s 99.902% <= 2.111 milliseconds (cumulative count 99910)
202s 99.951% <= 2.215 milliseconds (cumulative count 99960)
202s 99.976% <= 2.279 milliseconds (cumulative count 99980)
202s 99.988% <= 2.303 milliseconds (cumulative count 99990)
202s 99.994% <= 2.335 milliseconds (cumulative count 100000)
202s 100.000% <= 2.335 milliseconds (cumulative count 100000)
202s 
202s Cumulative distribution of latencies:
202s 0.000% <= 0.103 milliseconds (cumulative count 0)
202s 0.050% <= 0.207 milliseconds (cumulative count 50)
202s 0.150% <= 0.303 milliseconds (cumulative count 150)
202s 35.760% <= 0.407 milliseconds (cumulative count 35760)
202s 84.190% <= 0.503 milliseconds (cumulative count 84190)
202s 94.520% <= 0.607 milliseconds (cumulative count 94520)
202s 98.300% <= 0.703 milliseconds (cumulative count 98300)
202s 98.950% <= 0.807 milliseconds (cumulative count 98950)
202s 99.240% <= 0.903 milliseconds (cumulative count 99240)
202s 99.460% <= 1.007 milliseconds (cumulative count 99460)
202s 99.500% <= 1.103 milliseconds (cumulative count 99500)
202s 99.520% <= 1.303 milliseconds (cumulative count 99520)
202s 99.580% <= 1.407 milliseconds (cumulative count 99580)
202s 99.650% <= 1.503 milliseconds (cumulative count 99650)
202s 99.700% <= 1.607 milliseconds (cumulative count 99700)
202s 99.740% <= 1.703 milliseconds (cumulative count 99740)
202s 99.790% <= 1.807 milliseconds (cumulative count 99790)
202s 99.830% <= 1.903 milliseconds (cumulative count 99830)
202s 99.860% <= 2.007 milliseconds (cumulative count 99860)
202s 99.900% <= 2.103 milliseconds (cumulative count 99900)
202s 100.000% <= 3.103 milliseconds (cumulative count 100000)
202s 
202s Summary:
202s   throughput summary: 588235.31 requests per second
202s   latency summary (msec):
202s           avg       min       p50       p95       p99       max
202s         0.455     0.144     0.423     0.623     0.823     2.335
203s PING_MBULK: rps=186320.0 (overall: 589620.2) avg_msec=0.447 (overall: 0.447)
203s ====== PING_MBULK ======
203s 100000 requests completed in 0.17 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s 
203s Latency by percentile distribution:
203s 0.000% <= 0.191 milliseconds (cumulative count 10)
203s 50.000% <= 0.431 milliseconds (cumulative count 54470)
203s 75.000% <= 0.479 milliseconds (cumulative count 75870)
203s 87.500% <= 0.519 milliseconds (cumulative count 87760)
203s 93.750% <= 0.567 milliseconds (cumulative count 93990)
203s 96.875% <= 0.623 milliseconds (cumulative count 96900)
203s 98.438% <= 0.679 milliseconds (cumulative count 98470)
203s 99.219% <= 0.735 milliseconds (cumulative count 99250)
203s 99.609% <= 0.855 milliseconds (cumulative count 99660)
203s 99.805% <= 0.895 milliseconds (cumulative count 99810)
203s 99.902% <= 0.927 milliseconds (cumulative count 99910)
203s 99.951% <= 0.935 milliseconds (cumulative count 99970)
203s 99.976% <= 0.943 milliseconds (cumulative count 100000)
203s 100.000% <= 0.943 milliseconds (cumulative count 100000)
203s 
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.020% <= 0.207 milliseconds (cumulative count 20)
203s 0.170% <= 0.303 milliseconds (cumulative count 170)
203s 27.130% <= 0.407 milliseconds (cumulative count 27130)
203s 84.110% <= 0.503 milliseconds (cumulative count 84110)
203s 96.100% <= 0.607 milliseconds (cumulative count 96100)
203s 98.860% <= 0.703 milliseconds (cumulative count 98860)
203s 99.490% <= 0.807 milliseconds (cumulative count 99490)
203s 99.830% <= 0.903 milliseconds (cumulative count 99830)
203s 100.000% <= 1.007 milliseconds (cumulative count 100000)
203s 
203s Summary:
203s   throughput summary: 584795.31 requests per second
203s   latency summary (msec):
203s           avg       min       p50       p95       p99       max
203s         0.451     0.184     0.431     0.591     0.719     0.943
203s SET: rps=387768.9 (overall: 619936.3) avg_msec=0.456 (overall: 0.456)
203s ====== SET ======
203s 100000 requests completed in 0.16 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s 
203s Latency by percentile distribution:
203s 0.000% <= 0.143 milliseconds (cumulative count 10)
203s 50.000% <= 0.431 milliseconds (cumulative count 53810)
203s 75.000% <= 0.487 milliseconds (cumulative count 76530)
203s 87.500% <= 0.543 milliseconds (cumulative count 88390)
203s 93.750% <= 0.623 milliseconds (cumulative count 94200)
203s 96.875% <= 0.695 milliseconds (cumulative count 96930)
203s 98.438% <= 0.791 milliseconds (cumulative count 98490)
203s 99.219% <= 0.871 milliseconds (cumulative count 99240)
203s 99.609% <= 0.967 milliseconds (cumulative count 99610)
203s 99.805% <= 1.127 milliseconds (cumulative count 99810)
203s 99.902% <= 1.399 milliseconds (cumulative count 99910)
203s 99.951% <= 1.567 milliseconds (cumulative count 99960)
203s 99.976% <= 1.615 milliseconds (cumulative count 99980)
203s 99.988% <= 1.631 milliseconds (cumulative count 99990)
203s 99.994% <= 1.727 milliseconds (cumulative count 100000)
203s 100.000% <= 1.727 milliseconds (cumulative count 100000)
203s 
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.060% <= 0.207 milliseconds (cumulative count 60)
203s 0.240% <= 0.303 milliseconds (cumulative count 240)
203s 35.100% <= 0.407 milliseconds (cumulative count 35100)
203s 81.730% <= 0.503 milliseconds (cumulative count 81730)
203s 93.260% <= 0.607 milliseconds (cumulative count 93260)
203s 97.100% <= 0.703 milliseconds (cumulative count 97100)
203s 98.610% <= 0.807 milliseconds (cumulative count 98610)
203s 99.440% <= 0.903 milliseconds (cumulative count 99440)
203s 99.710% <= 1.007 milliseconds (cumulative count 99710)
203s 99.780% <= 1.103 milliseconds (cumulative count 99780)
203s 99.840% <= 1.207 milliseconds (cumulative count 99840)
203s 99.870% <= 1.303 milliseconds (cumulative count 99870)
203s 99.910% <= 1.407 milliseconds (cumulative count 99910)
203s 99.930% <= 1.503 milliseconds (cumulative count 99930)
203s 99.970% <= 1.607 milliseconds (cumulative count 99970)
203s 99.990% <= 1.703 milliseconds (cumulative count 99990)
203s 100.000% <= 1.807 milliseconds (cumulative count 100000)
203s 
203s Summary:
203s   throughput summary: 621118.00 requests per second
203s   latency summary (msec):
203s           avg       min       p50       p95       p99       max
203s         0.457     0.136     0.431     0.647     0.847     1.727
203s ====== GET ======
203s 100000 requests completed in 0.16 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s 
203s Latency by percentile distribution:
203s 0.000% <= 0.159 milliseconds (cumulative count 10)
203s 50.000% <= 0.415 milliseconds (cumulative count 55300)
203s 75.000% <= 0.455 milliseconds (cumulative count 76250)
203s 87.500% <= 0.479 milliseconds (cumulative count 87500)
203s 93.750% <= 0.511 milliseconds (cumulative count 94470)
203s 96.875% <= 0.551 milliseconds (cumulative count 97030)
203s 98.438% <= 0.647 milliseconds (cumulative count 98470)
203s 99.219% <= 0.783 milliseconds (cumulative count 99220)
203s 99.609% <= 1.223 milliseconds (cumulative count 99610)
203s 99.805% <= 1.599 milliseconds (cumulative count 99810)
203s 99.902% <= 1.863 milliseconds (cumulative count 99910)
203s 99.951% <= 1.983 milliseconds (cumulative count 99960)
203s 99.976% <= 2.039 milliseconds (cumulative count 99980)
203s 99.988% <= 2.055 milliseconds (cumulative count 99990)
203s 99.994% <= 2.095 milliseconds (cumulative count 100000)
203s 100.000% <= 2.095 milliseconds (cumulative count 100000)
203s 
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.060% <= 0.207 milliseconds (cumulative count 60)
203s 0.200% <= 0.303 milliseconds (cumulative count 200)
203s 46.990% <= 0.407 milliseconds (cumulative count 46990)
203s 93.600% <= 0.503 milliseconds (cumulative count 93600)
203s 98.080% <= 0.607 milliseconds (cumulative count 98080)
203s 98.830% <= 0.703 milliseconds (cumulative count 98830)
203s 99.270% <= 0.807 milliseconds (cumulative count 99270)
203s 99.440% <= 0.903 milliseconds (cumulative count 99440)
203s 99.500% <= 1.007 milliseconds (cumulative count 99500)
203s 99.540% <= 1.103 milliseconds (cumulative count 99540)
203s 99.600% <= 1.207 milliseconds (cumulative count 99600)
203s 99.660% <= 1.303 milliseconds (cumulative count 99660)
203s 99.710% <= 1.407 milliseconds (cumulative count 99710)
203s 99.760% <= 1.503 milliseconds (cumulative count 99760)
203s 99.810% <= 1.607 milliseconds (cumulative count 99810)
203s 99.840% <= 1.703 milliseconds (cumulative count 99840)
203s 99.880% <= 1.807 milliseconds (cumulative count 99880)
203s 99.930% <= 1.903 milliseconds (cumulative count 99930)
203s 99.970% <=
2.007 milliseconds (cumulative count 99970) 203s 100.000% <= 2.103 milliseconds (cumulative count 100000) 203s 203s Summary: 203s throughput summary: 621118.00 requests per second 203s latency summary (msec): 203s avg min p50 p95 p99 max 203s 0.431 0.152 0.415 0.519 0.735 2.095 203s INCR: rps=196320.0 (overall: 613500.0) avg_msec=0.443 (overall: 0.443) ====== INCR ====== 203s 100000 requests completed in 0.16 seconds 203s 50 parallel clients 203s 3 bytes payload 203s keep alive: 1 203s host configuration "save": 3600 1 300 100 60 10000 203s host configuration "appendonly": no 203s multi-thread: no 203s 203s Latency by percentile distribution: 203s 0.000% <= 0.135 milliseconds (cumulative count 10) 203s 50.000% <= 0.415 milliseconds (cumulative count 50440) 203s 75.000% <= 0.471 milliseconds (cumulative count 78290) 203s 87.500% <= 0.503 milliseconds (cumulative count 88630) 203s 93.750% <= 0.551 milliseconds (cumulative count 94270) 203s 96.875% <= 0.599 milliseconds (cumulative count 97230) 203s 98.438% <= 0.655 milliseconds (cumulative count 98460) 203s 99.219% <= 0.703 milliseconds (cumulative count 99270) 203s 99.609% <= 0.735 milliseconds (cumulative count 99640) 203s 99.805% <= 0.783 milliseconds (cumulative count 99840) 203s 99.902% <= 0.839 milliseconds (cumulative count 99930) 203s 99.951% <= 0.871 milliseconds (cumulative count 99960) 203s 99.976% <= 0.887 milliseconds (cumulative count 99980) 203s 99.988% <= 0.903 milliseconds (cumulative count 99990) 203s 99.994% <= 0.927 milliseconds (cumulative count 100000) 203s 100.000% <= 0.927 milliseconds (cumulative count 100000) 203s 203s Cumulative distribution of latencies: 203s 0.000% <= 0.103 milliseconds (cumulative count 0) 203s 0.070% <= 0.207 milliseconds (cumulative count 70) 203s 0.150% <= 0.303 milliseconds (cumulative count 150) 203s 42.180% <= 0.407 milliseconds (cumulative count 42180) 203s 88.630% <= 0.503 milliseconds (cumulative count 88630) 203s 97.530% <= 0.607 milliseconds (cumulative count 
97530) 203s 99.270% <= 0.703 milliseconds (cumulative count 99270) 203s 99.890% <= 0.807 milliseconds (cumulative count 99890) 203s 99.990% <= 0.903 milliseconds (cumulative count 99990) 203s 100.000% <= 1.007 milliseconds (cumulative count 100000) 203s 203s Summary: 203s throughput summary: 613496.94 requests per second 203s latency summary (msec): 203s avg min p50 p95 p99 max 203s 0.436 0.128 0.415 0.567 0.687 0.927 203s ====== LPUSH ====== 203s 100000 requests completed in 0.16 seconds 203s 50 parallel clients 203s 3 bytes payload 203s keep alive: 1 203s host configuration "save": 3600 1 300 100 60 10000 203s host configuration "appendonly": no 203s multi-thread: no 203s 203s Latency by percentile distribution: 203s 0.000% <= 0.167 milliseconds (cumulative count 10) 203s 50.000% <= 0.447 milliseconds (cumulative count 52750) 203s 75.000% <= 0.519 milliseconds (cumulative count 76720) 203s 87.500% <= 0.583 milliseconds (cumulative count 87750) 203s 93.750% <= 0.663 milliseconds (cumulative count 93890) 203s 96.875% <= 0.743 milliseconds (cumulative count 97060) 203s 98.438% <= 0.807 milliseconds (cumulative count 98510) 203s 99.219% <= 0.871 milliseconds (cumulative count 99300) 203s 99.609% <= 1.087 milliseconds (cumulative count 99610) 203s 99.805% <= 1.447 milliseconds (cumulative count 99810) 203s 99.902% <= 1.695 milliseconds (cumulative count 99910) 203s 99.951% <= 1.831 milliseconds (cumulative count 99960) 203s 99.976% <= 1.879 milliseconds (cumulative count 99980) 203s 99.988% <= 1.895 milliseconds (cumulative count 99990) 203s 99.994% <= 1.927 milliseconds (cumulative count 100000) 203s 100.000% <= 1.927 milliseconds (cumulative count 100000) 203s 203s Cumulative distribution of latencies: 203s 0.000% <= 0.103 milliseconds (cumulative count 0) 203s 0.030% <= 0.207 milliseconds (cumulative count 30) 203s 0.280% <= 0.303 milliseconds (cumulative count 280) 203s 30.730% <= 0.407 milliseconds (cumulative count 30730) 203s 72.530% <= 0.503 milliseconds 
(cumulative count 72530) 203s 90.040% <= 0.607 milliseconds (cumulative count 90040) 203s 95.750% <= 0.703 milliseconds (cumulative count 95750) 203s 98.510% <= 0.807 milliseconds (cumulative count 98510) 203s 99.420% <= 0.903 milliseconds (cumulative count 99420) 203s 99.550% <= 1.007 milliseconds (cumulative count 99550) 203s 99.610% <= 1.103 milliseconds (cumulative count 99610) 203s 99.680% <= 1.207 milliseconds (cumulative count 99680) 203s 99.720% <= 1.303 milliseconds (cumulative count 99720) 203s 99.790% <= 1.407 milliseconds (cumulative count 99790) 203s 99.830% <= 1.503 milliseconds (cumulative count 99830) 203s 99.880% <= 1.607 milliseconds (cumulative count 99880) 203s 99.910% <= 1.703 milliseconds (cumulative count 99910) 203s 99.950% <= 1.807 milliseconds (cumulative count 99950) 203s 99.990% <= 1.903 milliseconds (cumulative count 99990) 203s 100.000% <= 2.007 milliseconds (cumulative count 100000) 203s 203s Summary: 203s throughput summary: 636942.62 requests per second 203s latency summary (msec): 203s avg min p50 p95 p99 max 203s 0.473 0.160 0.447 0.687 0.847 1.927 203s RPUSH: rps=15537.8 (overall: 557142.8) avg_msec=0.623 (overall: 0.623) ====== RPUSH ====== 203s 100000 requests completed in 0.16 seconds 203s 50 parallel clients 203s 3 bytes payload 203s keep alive: 1 203s host configuration "save": 3600 1 300 100 60 10000 203s host configuration "appendonly": no 203s multi-thread: no 203s 203s Latency by percentile distribution: 203s 0.000% <= 0.279 milliseconds (cumulative count 20) 203s 50.000% <= 0.431 milliseconds (cumulative count 51820) 203s 75.000% <= 0.479 milliseconds (cumulative count 75470) 203s 87.500% <= 0.527 milliseconds (cumulative count 88180) 203s 93.750% <= 0.615 milliseconds (cumulative count 93780) 203s 96.875% <= 0.719 milliseconds (cumulative count 96920) 203s 98.438% <= 0.807 milliseconds (cumulative count 98540) 203s 99.219% <= 0.871 milliseconds (cumulative count 99250) 203s 99.609% <= 0.935 milliseconds (cumulative 
count 99610) 203s 99.805% <= 0.999 milliseconds (cumulative count 99810) 203s 99.902% <= 1.079 milliseconds (cumulative count 99910) 203s 99.951% <= 1.159 milliseconds (cumulative count 99960) 203s 99.976% <= 1.191 milliseconds (cumulative count 99980) 203s 99.988% <= 1.199 milliseconds (cumulative count 99990) 203s 99.994% <= 1.215 milliseconds (cumulative count 100000) 203s 100.000% <= 1.215 milliseconds (cumulative count 100000) 203s 203s Cumulative distribution of latencies: 203s 0.000% <= 0.103 milliseconds (cumulative count 0) 203s 0.080% <= 0.303 milliseconds (cumulative count 80) 203s 35.290% <= 0.407 milliseconds (cumulative count 35290) 203s 84.100% <= 0.503 milliseconds (cumulative count 84100) 203s 93.370% <= 0.607 milliseconds (cumulative count 93370) 203s 96.540% <= 0.703 milliseconds (cumulative count 96540) 203s 98.540% <= 0.807 milliseconds (cumulative count 98540) 203s 99.450% <= 0.903 milliseconds (cumulative count 99450) 203s 99.840% <= 1.007 milliseconds (cumulative count 99840) 203s 99.930% <= 1.103 milliseconds (cumulative count 99930) 203s 99.990% <= 1.207 milliseconds (cumulative count 99990) 203s 100.000% <= 1.303 milliseconds (cumulative count 100000) 203s 203s Summary: 203s throughput summary: 621118.00 requests per second 203s latency summary (msec): 203s avg min p50 p95 p99 max 203s 0.455 0.272 0.431 0.655 0.847 1.215 204s LPOP: rps=234320.0 (overall: 623191.5) avg_msec=0.566 (overall: 0.566) ====== LPOP ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration "save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.247 milliseconds (cumulative count 10) 204s 50.000% <= 0.527 milliseconds (cumulative count 50120) 204s 75.000% <= 0.655 milliseconds (cumulative count 75050) 204s 87.500% <= 0.767 milliseconds (cumulative count 87990) 204s 93.750% <= 0.855 
milliseconds (cumulative count 94130) 204s 96.875% <= 0.911 milliseconds (cumulative count 97010) 204s 98.438% <= 0.951 milliseconds (cumulative count 98470) 204s 99.219% <= 0.999 milliseconds (cumulative count 99270) 204s 99.609% <= 1.055 milliseconds (cumulative count 99630) 204s 99.805% <= 1.103 milliseconds (cumulative count 99820) 204s 99.902% <= 1.167 milliseconds (cumulative count 99930) 204s 99.951% <= 1.199 milliseconds (cumulative count 99960) 204s 99.976% <= 1.239 milliseconds (cumulative count 99980) 204s 99.988% <= 1.247 milliseconds (cumulative count 99990) 204s 99.994% <= 1.263 milliseconds (cumulative count 100000) 204s 100.000% <= 1.263 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.100% <= 0.303 milliseconds (cumulative count 100) 204s 13.360% <= 0.407 milliseconds (cumulative count 13360) 204s 44.070% <= 0.503 milliseconds (cumulative count 44070) 204s 66.850% <= 0.607 milliseconds (cumulative count 66850) 204s 81.520% <= 0.703 milliseconds (cumulative count 81520) 204s 91.060% <= 0.807 milliseconds (cumulative count 91060) 204s 96.570% <= 0.903 milliseconds (cumulative count 96570) 204s 99.320% <= 1.007 milliseconds (cumulative count 99320) 204s 99.820% <= 1.103 milliseconds (cumulative count 99820) 204s 99.970% <= 1.207 milliseconds (cumulative count 99970) 204s 100.000% <= 1.303 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 628930.81 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 0.564 0.240 0.527 0.879 0.983 1.263 204s ====== RPOP ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration "save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.159 milliseconds (cumulative count 10) 204s 
50.000% <= 0.447 milliseconds (cumulative count 54030) 204s 75.000% <= 0.495 milliseconds (cumulative count 75150) 204s 87.500% <= 0.551 milliseconds (cumulative count 88420) 204s 93.750% <= 0.599 milliseconds (cumulative count 93880) 204s 96.875% <= 0.655 milliseconds (cumulative count 97000) 204s 98.438% <= 0.727 milliseconds (cumulative count 98510) 204s 99.219% <= 0.791 milliseconds (cumulative count 99230) 204s 99.609% <= 0.855 milliseconds (cumulative count 99620) 204s 99.805% <= 0.895 milliseconds (cumulative count 99810) 204s 99.902% <= 0.927 milliseconds (cumulative count 99910) 204s 99.951% <= 0.991 milliseconds (cumulative count 99960) 204s 99.976% <= 1.007 milliseconds (cumulative count 99980) 204s 99.988% <= 1.015 milliseconds (cumulative count 99990) 204s 99.994% <= 1.023 milliseconds (cumulative count 100000) 204s 100.000% <= 1.023 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.020% <= 0.207 milliseconds (cumulative count 20) 204s 0.150% <= 0.303 milliseconds (cumulative count 150) 204s 30.260% <= 0.407 milliseconds (cumulative count 30260) 204s 77.760% <= 0.503 milliseconds (cumulative count 77760) 204s 94.440% <= 0.607 milliseconds (cumulative count 94440) 204s 98.150% <= 0.703 milliseconds (cumulative count 98150) 204s 99.360% <= 0.807 milliseconds (cumulative count 99360) 204s 99.850% <= 0.903 milliseconds (cumulative count 99850) 204s 99.980% <= 1.007 milliseconds (cumulative count 99980) 204s 100.000% <= 1.103 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 636942.62 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 0.459 0.152 0.447 0.615 0.775 1.023 204s SADD: rps=59681.3 (overall: 599200.0) avg_msec=0.435 (overall: 0.435) ====== SADD ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration 
"save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.143 milliseconds (cumulative count 10) 204s 50.000% <= 0.407 milliseconds (cumulative count 50480) 204s 75.000% <= 0.455 milliseconds (cumulative count 78970) 204s 87.500% <= 0.479 milliseconds (cumulative count 89230) 204s 93.750% <= 0.503 milliseconds (cumulative count 93910) 204s 96.875% <= 0.559 milliseconds (cumulative count 96890) 204s 98.438% <= 0.631 milliseconds (cumulative count 98440) 204s 99.219% <= 0.719 milliseconds (cumulative count 99320) 204s 99.609% <= 0.767 milliseconds (cumulative count 99620) 204s 99.805% <= 0.839 milliseconds (cumulative count 99810) 204s 99.902% <= 0.919 milliseconds (cumulative count 99910) 204s 99.951% <= 0.975 milliseconds (cumulative count 99960) 204s 99.976% <= 1.007 milliseconds (cumulative count 99980) 204s 99.988% <= 1.031 milliseconds (cumulative count 99990) 204s 99.994% <= 1.167 milliseconds (cumulative count 100000) 204s 100.000% <= 1.167 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.060% <= 0.207 milliseconds (cumulative count 60) 204s 0.200% <= 0.303 milliseconds (cumulative count 200) 204s 50.480% <= 0.407 milliseconds (cumulative count 50480) 204s 93.910% <= 0.503 milliseconds (cumulative count 93910) 204s 97.990% <= 0.607 milliseconds (cumulative count 97990) 204s 99.150% <= 0.703 milliseconds (cumulative count 99150) 204s 99.740% <= 0.807 milliseconds (cumulative count 99740) 204s 99.900% <= 0.903 milliseconds (cumulative count 99900) 204s 99.980% <= 1.007 milliseconds (cumulative count 99980) 204s 99.990% <= 1.103 milliseconds (cumulative count 99990) 204s 100.000% <= 1.207 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 628930.81 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 
0.424 0.136 0.407 0.519 0.695 1.167 204s HSET: rps=294360.0 (overall: 645526.3) avg_msec=0.471 (overall: 0.471) ====== HSET ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration "save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.135 milliseconds (cumulative count 10) 204s 50.000% <= 0.431 milliseconds (cumulative count 52450) 204s 75.000% <= 0.487 milliseconds (cumulative count 76050) 204s 87.500% <= 0.559 milliseconds (cumulative count 87790) 204s 93.750% <= 0.671 milliseconds (cumulative count 93970) 204s 96.875% <= 0.767 milliseconds (cumulative count 97030) 204s 98.438% <= 0.831 milliseconds (cumulative count 98460) 204s 99.219% <= 0.879 milliseconds (cumulative count 99240) 204s 99.609% <= 0.911 milliseconds (cumulative count 99620) 204s 99.805% <= 0.943 milliseconds (cumulative count 99830) 204s 99.902% <= 0.975 milliseconds (cumulative count 99910) 204s 99.951% <= 1.007 milliseconds (cumulative count 99960) 204s 99.976% <= 1.071 milliseconds (cumulative count 99980) 204s 99.988% <= 1.087 milliseconds (cumulative count 99990) 204s 99.994% <= 1.111 milliseconds (cumulative count 100000) 204s 100.000% <= 1.111 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.070% <= 0.207 milliseconds (cumulative count 70) 204s 0.230% <= 0.303 milliseconds (cumulative count 230) 204s 36.370% <= 0.407 milliseconds (cumulative count 36370) 204s 80.160% <= 0.503 milliseconds (cumulative count 80160) 204s 91.090% <= 0.607 milliseconds (cumulative count 91090) 204s 95.080% <= 0.703 milliseconds (cumulative count 95080) 204s 97.960% <= 0.807 milliseconds (cumulative count 97960) 204s 99.520% <= 0.903 milliseconds (cumulative count 99520) 204s 99.960% <= 1.007 milliseconds (cumulative count 
99960) 204s 99.990% <= 1.103 milliseconds (cumulative count 99990) 204s 100.000% <= 1.207 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 641025.62 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 0.460 0.128 0.431 0.703 0.863 1.111 204s ====== SPOP ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration "save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.159 milliseconds (cumulative count 10) 204s 50.000% <= 0.399 milliseconds (cumulative count 51930) 204s 75.000% <= 0.439 milliseconds (cumulative count 76160) 204s 87.500% <= 0.463 milliseconds (cumulative count 89320) 204s 93.750% <= 0.479 milliseconds (cumulative count 94780) 204s 96.875% <= 0.503 milliseconds (cumulative count 97030) 204s 98.438% <= 0.559 milliseconds (cumulative count 98470) 204s 99.219% <= 0.631 milliseconds (cumulative count 99220) 204s 99.609% <= 0.735 milliseconds (cumulative count 99610) 204s 99.805% <= 0.807 milliseconds (cumulative count 99810) 204s 99.902% <= 0.863 milliseconds (cumulative count 99910) 204s 99.951% <= 0.903 milliseconds (cumulative count 99960) 204s 99.976% <= 0.919 milliseconds (cumulative count 99980) 204s 99.988% <= 0.935 milliseconds (cumulative count 99990) 204s 99.994% <= 1.015 milliseconds (cumulative count 100000) 204s 100.000% <= 1.015 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.070% <= 0.207 milliseconds (cumulative count 70) 204s 0.210% <= 0.303 milliseconds (cumulative count 210) 204s 60.250% <= 0.407 milliseconds (cumulative count 60250) 204s 97.030% <= 0.503 milliseconds (cumulative count 97030) 204s 99.020% <= 0.607 milliseconds (cumulative count 99020) 204s 99.530% <= 0.703 milliseconds 
(cumulative count 99530) 204s 99.810% <= 0.807 milliseconds (cumulative count 99810) 204s 99.960% <= 0.903 milliseconds (cumulative count 99960) 204s 99.990% <= 1.007 milliseconds (cumulative count 99990) 204s 100.000% <= 1.103 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 641025.62 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 0.413 0.152 0.399 0.487 0.607 1.015 204s ZADD: rps=120160.0 (overall: 639148.9) avg_msec=0.567 (overall: 0.567) ====== ZADD ====== 204s 100000 requests completed in 0.16 seconds 204s 50 parallel clients 204s 3 bytes payload 204s keep alive: 1 204s host configuration "save": 3600 1 300 100 60 10000 204s host configuration "appendonly": no 204s multi-thread: no 204s 204s Latency by percentile distribution: 204s 0.000% <= 0.223 milliseconds (cumulative count 10) 204s 50.000% <= 0.463 milliseconds (cumulative count 51170) 204s 75.000% <= 0.535 milliseconds (cumulative count 75360) 204s 87.500% <= 0.623 milliseconds (cumulative count 87580) 204s 93.750% <= 0.711 milliseconds (cumulative count 94080) 204s 96.875% <= 0.791 milliseconds (cumulative count 96890) 204s 98.438% <= 0.879 milliseconds (cumulative count 98470) 204s 99.219% <= 0.943 milliseconds (cumulative count 99310) 204s 99.609% <= 0.975 milliseconds (cumulative count 99640) 204s 99.805% <= 1.023 milliseconds (cumulative count 99820) 204s 99.902% <= 1.095 milliseconds (cumulative count 99910) 204s 99.951% <= 1.127 milliseconds (cumulative count 99960) 204s 99.976% <= 1.167 milliseconds (cumulative count 99980) 204s 99.988% <= 1.183 milliseconds (cumulative count 99990) 204s 99.994% <= 1.231 milliseconds (cumulative count 100000) 204s 100.000% <= 1.231 milliseconds (cumulative count 100000) 204s 204s Cumulative distribution of latencies: 204s 0.000% <= 0.103 milliseconds (cumulative count 0) 204s 0.180% <= 0.303 milliseconds (cumulative count 180) 204s 23.790% <= 0.407 milliseconds (cumulative count 23790) 204s 
66.720% <= 0.503 milliseconds (cumulative count 66720) 204s 86.010% <= 0.607 milliseconds (cumulative count 86010) 204s 93.560% <= 0.703 milliseconds (cumulative count 93560) 204s 97.260% <= 0.807 milliseconds (cumulative count 97260) 204s 98.790% <= 0.903 milliseconds (cumulative count 98790) 204s 99.790% <= 1.007 milliseconds (cumulative count 99790) 204s 99.910% <= 1.103 milliseconds (cumulative count 99910) 204s 99.990% <= 1.207 milliseconds (cumulative count 99990) 204s 100.000% <= 1.303 milliseconds (cumulative count 100000) 204s 204s Summary: 204s throughput summary: 636942.62 requests per second 204s latency summary (msec): 204s avg min p50 p95 p99 max 204s 0.491 0.216 0.463 0.735 0.919 1.231 205s ZPOPMIN: rps=338964.2 (overall: 612086.3) avg_msec=0.427 (overall: 0.427) ====== ZPOPMIN ====== 205s 100000 requests completed in 0.16 seconds 205s 50 parallel clients 205s 3 bytes payload 205s keep alive: 1 205s host configuration "save": 3600 1 300 100 60 10000 205s host configuration "appendonly": no 205s multi-thread: no 205s 205s Latency by percentile distribution: 205s 0.000% <= 0.135 milliseconds (cumulative count 10) 205s 50.000% <= 0.407 milliseconds (cumulative count 54430) 205s 75.000% <= 0.455 milliseconds (cumulative count 75120) 205s 87.500% <= 0.487 milliseconds (cumulative count 87770) 205s 93.750% <= 0.527 milliseconds (cumulative count 93880) 205s 96.875% <= 0.567 milliseconds (cumulative count 96890) 205s 98.438% <= 0.615 milliseconds (cumulative count 98560) 205s 99.219% <= 0.671 milliseconds (cumulative count 99290) 205s 99.609% <= 0.759 milliseconds (cumulative count 99630) 205s 99.805% <= 0.895 milliseconds (cumulative count 99810) 205s 99.902% <= 0.919 milliseconds (cumulative count 99950) 205s 99.951% <= 0.927 milliseconds (cumulative count 100000) 205s 100.000% <= 0.927 milliseconds (cumulative count 100000) 205s 205s Cumulative distribution of latencies: 205s 0.000% <= 0.103 milliseconds (cumulative count 0) 205s 0.080% <= 0.207 
milliseconds (cumulative count 80) 205s 0.220% <= 0.303 milliseconds (cumulative count 220) 205s 54.430% <= 0.407 milliseconds (cumulative count 54430) 205s 90.890% <= 0.503 milliseconds (cumulative count 90890) 205s 98.430% <= 0.607 milliseconds (cumulative count 98430) 205s 99.450% <= 0.703 milliseconds (cumulative count 99450) 205s 99.740% <= 0.807 milliseconds (cumulative count 99740) 205s 99.860% <= 0.903 milliseconds (cumulative count 99860) 205s 100.000% <= 1.007 milliseconds (cumulative count 100000) 205s 205s Summary: 205s throughput summary: 617283.94 requests per second 205s latency summary (msec): 205s avg min p50 p95 p99 max 205s 0.425 0.128 0.407 0.543 0.655 0.927 205s ====== LPUSH (needed to benchmark LRANGE) ====== 205s 100000 requests completed in 0.16 seconds 205s 50 parallel clients 205s 3 bytes payload 205s keep alive: 1 205s host configuration "save": 3600 1 300 100 60 10000 205s host configuration "appendonly": no 205s multi-thread: no 205s 205s Latency by percentile distribution: 205s 0.000% <= 0.191 milliseconds (cumulative count 10) 205s 50.000% <= 0.439 milliseconds (cumulative count 52930) 205s 75.000% <= 0.511 milliseconds (cumulative count 75810) 205s 87.500% <= 0.591 milliseconds (cumulative count 87960) 205s 93.750% <= 0.695 milliseconds (cumulative count 93950) 205s 96.875% <= 0.831 milliseconds (cumulative count 96890) 205s 98.438% <= 0.991 milliseconds (cumulative count 98500) 205s 99.219% <= 1.103 milliseconds (cumulative count 99250) 205s 99.609% <= 1.255 milliseconds (cumulative count 99630) 205s 99.805% <= 1.351 milliseconds (cumulative count 99810) 205s 99.902% <= 1.407 milliseconds (cumulative count 99910) 205s 99.951% <= 1.479 milliseconds (cumulative count 99960) 205s 99.976% <= 1.519 milliseconds (cumulative count 99980) 205s 99.988% <= 1.559 milliseconds (cumulative count 99990) 205s 99.994% <= 1.823 milliseconds (cumulative count 100000) 205s 100.000% <= 1.823 milliseconds (cumulative count 100000) 205s 205s Cumulative 
distribution of latencies: 205s 0.000% <= 0.103 milliseconds (cumulative count 0) 205s 0.020% <= 0.207 milliseconds (cumulative count 20) 205s 0.160% <= 0.303 milliseconds (cumulative count 160) 205s 34.090% <= 0.407 milliseconds (cumulative count 34090) 205s 73.910% <= 0.503 milliseconds (cumulative count 73910) 205s 89.340% <= 0.607 milliseconds (cumulative count 89340) 205s 94.200% <= 0.703 milliseconds (cumulative count 94200) 205s 96.450% <= 0.807 milliseconds (cumulative count 96450) 205s 97.750% <= 0.903 milliseconds (cumulative count 97750) 205s 98.650% <= 1.007 milliseconds (cumulative count 98650) 205s 99.250% <= 1.103 milliseconds (cumulative count 99250) 205s 99.530% <= 1.207 milliseconds (cumulative count 99530) 205s 99.720% <= 1.303 milliseconds (cumulative count 99720) 205s 99.910% <= 1.407 milliseconds (cumulative count 99910) 205s 99.970% <= 1.503 milliseconds (cumulative count 99970) 205s 99.990% <= 1.607 milliseconds (cumulative count 99990) 205s 100.000% <= 1.903 milliseconds (cumulative count 100000) 205s 205s Summary: 205s throughput summary: 617283.94 requests per second 205s latency summary (msec): 205s avg min p50 p95 p99 max 205s 0.477 0.184 0.439 0.735 1.055 1.823 206s LRANGE_100 (first 100 elements): rps=23864.5 (overall: 98196.7) avg_msec=2.758 (overall: 2.758) LRANGE_100 (first 100 elements): rps=101553.8 (overall: 100897.4) avg_msec=2.502 (overall: 2.551) LRANGE_100 (first 100 elements): rps=102173.9 (overall: 101469.0) avg_msec=2.435 (overall: 2.498) LRANGE_100 (first 100 elements): rps=102629.5 (overall: 101826.0) avg_msec=2.426 (overall: 2.476) ====== LRANGE_100 (first 100 elements) ====== 206s 100000 requests completed in 0.98 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.575 milliseconds (cumulative count 10) 206s 50.000% <= 
2.439 milliseconds (cumulative count 50380) 206s 75.000% <= 2.551 milliseconds (cumulative count 75240) 206s 87.500% <= 2.623 milliseconds (cumulative count 87790) 206s 93.750% <= 2.703 milliseconds (cumulative count 94050) 206s 96.875% <= 2.847 milliseconds (cumulative count 96980) 206s 98.438% <= 3.231 milliseconds (cumulative count 98440) 206s 99.219% <= 4.335 milliseconds (cumulative count 99220) 206s 99.609% <= 5.567 milliseconds (cumulative count 99610) 206s 99.805% <= 6.327 milliseconds (cumulative count 99810) 206s 99.902% <= 6.687 milliseconds (cumulative count 99910) 206s 99.951% <= 6.991 milliseconds (cumulative count 99960) 206s 99.976% <= 7.215 milliseconds (cumulative count 99980) 206s 99.988% <= 7.319 milliseconds (cumulative count 99990) 206s 99.994% <= 7.415 milliseconds (cumulative count 100000) 206s 100.000% <= 7.415 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.010% <= 0.607 milliseconds (cumulative count 10) 206s 0.020% <= 0.703 milliseconds (cumulative count 20) 206s 0.030% <= 0.903 milliseconds (cumulative count 30) 206s 0.040% <= 1.007 milliseconds (cumulative count 40) 206s 0.050% <= 1.103 milliseconds (cumulative count 50) 206s 0.070% <= 1.207 milliseconds (cumulative count 70) 206s 0.090% <= 1.303 milliseconds (cumulative count 90) 206s 0.120% <= 1.407 milliseconds (cumulative count 120) 206s 0.150% <= 1.503 milliseconds (cumulative count 150) 206s 0.200% <= 1.607 milliseconds (cumulative count 200) 206s 0.220% <= 1.703 milliseconds (cumulative count 220) 206s 0.290% <= 1.807 milliseconds (cumulative count 290) 206s 0.370% <= 1.903 milliseconds (cumulative count 370) 206s 0.430% <= 2.007 milliseconds (cumulative count 430) 206s 1.280% <= 2.103 milliseconds (cumulative count 1280) 206s 98.190% <= 3.103 milliseconds (cumulative count 98190) 206s 99.130% <= 4.103 milliseconds (cumulative count 99130) 206s 99.460% <= 5.103 milliseconds 
(cumulative count 99460) 206s 99.730% <= 6.103 milliseconds (cumulative count 99730) 206s 99.970% <= 7.103 milliseconds (cumulative count 99970) 206s 100.000% <= 8.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 101832.99 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 2.470 0.568 2.439 2.735 3.807 7.415 209s LRANGE_300 (first 300 elements): rps=9848.6 (overall: 30146.3) avg_msec=8.445 (overall: 8.445) LRANGE_300 (first 300 elements): rps=32063.2 (overall: 31594.0) avg_msec=7.430 (overall: 7.667) LRANGE_300 (first 300 elements): rps=28608.7 (overall: 30309.5) avg_msec=9.813 (overall: 8.539) LRANGE_300 (first 300 elements): rps=23792.8 (overall: 28360.0) avg_msec=12.384 (overall: 9.504) LRANGE_300 (first 300 elements): rps=31015.7 (overall: 28977.1) avg_msec=9.148 (overall: 9.416) LRANGE_300 (first 300 elements): rps=31450.2 (overall: 29439.0) avg_msec=7.958 (overall: 9.125) LRANGE_300 (first 300 elements): rps=31211.2 (overall: 29717.9) avg_msec=8.014 (overall: 8.941) LRANGE_300 (first 300 elements): rps=30595.2 (overall: 29837.6) avg_msec=8.269 (overall: 8.847) LRANGE_300 (first 300 elements): rps=31716.5 (overall: 30064.7) avg_msec=7.828 (overall: 8.717) LRANGE_300 (first 300 elements): rps=28790.5 (overall: 29927.8) avg_msec=9.287 (overall: 8.776) LRANGE_300 (first 300 elements): rps=31349.2 (overall: 30065.2) avg_msec=7.701 (overall: 8.668) LRANGE_300 (first 300 elements): rps=31420.6 (overall: 30184.7) avg_msec=7.912 (overall: 8.598) LRANGE_300 (first 300 elements): rps=31944.4 (overall: 30327.3) avg_msec=7.661 (overall: 8.518) ====== LRANGE_300 (first 300 elements) ====== 209s 100000 requests completed in 3.29 seconds 209s 50 parallel clients 209s 3 bytes payload 209s keep alive: 1 209s host configuration "save": 3600 1 300 100 60 10000 209s host configuration "appendonly": no 209s multi-thread: no 209s 209s Latency by percentile distribution: 209s 0.000% <= 0.551 milliseconds 
(cumulative count 10) 209s 50.000% <= 7.831 milliseconds (cumulative count 50110) 209s 75.000% <= 9.623 milliseconds (cumulative count 75030) 209s 87.500% <= 11.855 milliseconds (cumulative count 87520) 209s 93.750% <= 14.087 milliseconds (cumulative count 93750) 209s 96.875% <= 16.735 milliseconds (cumulative count 96880) 209s 98.438% <= 19.087 milliseconds (cumulative count 98440) 209s 99.219% <= 22.015 milliseconds (cumulative count 99220) 209s 99.609% <= 24.287 milliseconds (cumulative count 99620) 209s 99.805% <= 25.471 milliseconds (cumulative count 99810) 209s 99.902% <= 26.191 milliseconds (cumulative count 99910) 209s 99.951% <= 26.671 milliseconds (cumulative count 99960) 209s 99.976% <= 27.999 milliseconds (cumulative count 99980) 209s 99.988% <= 28.223 milliseconds (cumulative count 99990) 209s 99.994% <= 28.415 milliseconds (cumulative count 100000) 209s 100.000% <= 28.415 milliseconds (cumulative count 100000) 209s 209s Cumulative distribution of latencies: 209s 0.000% <= 0.103 milliseconds (cumulative count 0) 209s 0.010% <= 0.607 milliseconds (cumulative count 10) 209s 0.020% <= 0.703 milliseconds (cumulative count 20) 209s 0.030% <= 0.807 milliseconds (cumulative count 30) 209s 0.060% <= 0.903 milliseconds (cumulative count 60) 209s 0.120% <= 1.103 milliseconds (cumulative count 120) 209s 0.150% <= 1.207 milliseconds (cumulative count 150) 209s 0.240% <= 1.303 milliseconds (cumulative count 240) 209s 0.310% <= 1.407 milliseconds (cumulative count 310) 209s 0.350% <= 1.503 milliseconds (cumulative count 350) 209s 0.420% <= 1.607 milliseconds (cumulative count 420) 209s 0.440% <= 1.703 milliseconds (cumulative count 440) 209s 0.480% <= 1.807 milliseconds (cumulative count 480) 209s 0.520% <= 1.903 milliseconds (cumulative count 520) 209s 0.560% <= 2.007 milliseconds (cumulative count 560) 209s 0.580% <= 2.103 milliseconds (cumulative count 580) 209s 0.970% <= 3.103 milliseconds (cumulative count 970) 209s 3.510% <= 4.103 milliseconds (cumulative 
count 3510) 209s 8.530% <= 5.103 milliseconds (cumulative count 8530) 209s 20.260% <= 6.103 milliseconds (cumulative count 20260) 209s 37.340% <= 7.103 milliseconds (cumulative count 37340) 209s 55.030% <= 8.103 milliseconds (cumulative count 55030) 209s 69.530% <= 9.103 milliseconds (cumulative count 69530) 209s 78.740% <= 10.103 milliseconds (cumulative count 78740) 209s 84.170% <= 11.103 milliseconds (cumulative count 84170) 209s 88.470% <= 12.103 milliseconds (cumulative count 88470) 209s 91.940% <= 13.103 milliseconds (cumulative count 91940) 209s 93.760% <= 14.103 milliseconds (cumulative count 93760) 209s 95.170% <= 15.103 milliseconds (cumulative count 95170) 209s 96.270% <= 16.103 milliseconds (cumulative count 96270) 209s 97.200% <= 17.103 milliseconds (cumulative count 97200) 209s 97.970% <= 18.111 milliseconds (cumulative count 97970) 209s 98.440% <= 19.103 milliseconds (cumulative count 98440) 209s 98.800% <= 20.111 milliseconds (cumulative count 98800) 209s 99.030% <= 21.103 milliseconds (cumulative count 99030) 209s 99.240% <= 22.111 milliseconds (cumulative count 99240) 209s 99.460% <= 23.103 milliseconds (cumulative count 99460) 209s 99.600% <= 24.111 milliseconds (cumulative count 99600) 209s 99.760% <= 25.103 milliseconds (cumulative count 99760) 209s 99.900% <= 26.111 milliseconds (cumulative count 99900) 209s 99.960% <= 27.103 milliseconds (cumulative count 99960) 209s 99.980% <= 28.111 milliseconds (cumulative count 99980) 209s 100.000% <= 29.103 milliseconds (cumulative count 100000) 209s 209s Summary: 209s throughput summary: 30404.38 requests per second 209s latency summary (msec): 209s avg min p50 p95 p99 max 209s 8.463 0.544 7.831 14.975 20.879 28.415 216s ====== LRANGE_500 (first 500 elements) ====== 216s 100000 requests completed in 6.83 seconds 216s 50 parallel clients 216s 3 bytes payload 216s keep alive: 1 216s host configuration "save": 3600 1 300 100 60 10000 216s host configuration "appendonly": no 216s multi-thread: no 216s 216s Latency by percentile distribution: 216s 0.000% <= 0.599 milliseconds (cumulative count 10) 216s 50.000% <= 18.079 milliseconds (cumulative count 50030) 216s 75.000% <= 22.847 milliseconds (cumulative count 75070) 216s 87.500% <= 26.607 milliseconds (cumulative count 87500) 216s 93.750% <= 29.167 milliseconds (cumulative count 93760) 216s 96.875% <= 30.687 milliseconds (cumulative count 96900) 216s 98.438% <= 32.367 milliseconds (cumulative count 98440) 216s 99.219% <= 35.583 milliseconds (cumulative count 99220) 216s 99.609% <= 37.055 milliseconds (cumulative count 99610) 216s 99.805% <= 38.079 milliseconds (cumulative count 99810) 216s 99.902% <= 38.879 milliseconds (cumulative count 99910) 216s 99.951% <= 39.615 milliseconds (cumulative count 99960) 216s 99.976% <= 39.999 milliseconds (cumulative count 99980) 216s 99.988% <= 40.159 milliseconds (cumulative count 99990) 216s 99.994% <= 40.351 milliseconds (cumulative count 100000) 216s 100.000% <= 40.351 milliseconds (cumulative count 100000) 216s 216s Cumulative distribution of latencies: 216s 0.000% <= 0.103 milliseconds (cumulative count 0) 216s 0.010% <= 0.607 milliseconds (cumulative count 10) 216s 0.030% <= 0.903 milliseconds (cumulative count 30) 216s 0.130% <= 1.007 milliseconds (cumulative count 130) 216s 0.150% <= 1.103 milliseconds (cumulative count 150) 
216s 0.320% <= 1.207 milliseconds (cumulative count 320) 216s 0.430% <= 1.303 milliseconds (cumulative count 430) 216s 0.540% <= 1.407 milliseconds (cumulative count 540) 216s 0.880% <= 1.503 milliseconds (cumulative count 880) 216s 1.060% <= 1.607 milliseconds (cumulative count 1060) 216s 1.360% <= 1.703 milliseconds (cumulative count 1360) 216s 1.550% <= 1.807 milliseconds (cumulative count 1550) 216s 1.730% <= 1.903 milliseconds (cumulative count 1730) 216s 1.840% <= 2.007 milliseconds (cumulative count 1840) 216s 1.890% <= 2.103 milliseconds (cumulative count 1890) 216s 2.950% <= 3.103 milliseconds (cumulative count 2950) 216s 4.260% <= 4.103 milliseconds (cumulative count 4260) 216s 5.210% <= 5.103 milliseconds (cumulative count 5210) 216s 6.850% <= 6.103 milliseconds (cumulative count 6850) 216s 8.930% <= 7.103 milliseconds (cumulative count 8930) 216s 11.440% <= 8.103 milliseconds (cumulative count 11440) 216s 14.160% <= 9.103 milliseconds (cumulative count 14160) 216s 16.990% <= 10.103 milliseconds (cumulative count 16990) 216s 19.660% <= 11.103 milliseconds (cumulative count 19660) 216s 22.340% <= 12.103 milliseconds (cumulative count 22340) 216s 25.550% <= 13.103 milliseconds (cumulative count 25550) 216s 28.970% <= 14.103 milliseconds (cumulative count 28970) 216s 32.690% <= 15.103 milliseconds (cumulative count 32690) 216s 37.940% <= 16.103 milliseconds (cumulative count 37940) 216s 44.290% <= 17.103 milliseconds (cumulative count 44290) 216s 50.230% <= 18.111 milliseconds (cumulative count 50230) 216s 55.680% <= 19.103 milliseconds (cumulative count 55680) 216s 61.090% <= 20.111 milliseconds (cumulative count 61090) 216s 66.350% <= 21.103 milliseconds (cumulative count 66350) 216s 71.680% <= 22.111 milliseconds (cumulative count 71680) 216s 76.240% <= 23.103 milliseconds (cumulative count 76240) 216s 80.460% <= 24.111 milliseconds (cumulative count 80460) 216s 83.590% <= 25.103 milliseconds (cumulative count 83590) 216s 86.240% <= 26.111 milliseconds 
(cumulative count 86240) 216s 88.660% <= 27.103 milliseconds (cumulative count 88660) 216s 90.990% <= 28.111 milliseconds (cumulative count 90990) 216s 93.570% <= 29.103 milliseconds (cumulative count 93570) 216s 95.810% <= 30.111 milliseconds (cumulative count 95810) 216s 97.460% <= 31.103 milliseconds (cumulative count 97460) 216s 98.310% <= 32.111 milliseconds (cumulative count 98310) 216s 98.700% <= 33.119 milliseconds (cumulative count 98700) 216s 98.910% <= 34.111 milliseconds (cumulative count 98910) 216s 99.140% <= 35.103 milliseconds (cumulative count 99140) 216s 99.330% <= 36.127 milliseconds (cumulative count 99330) 216s 99.610% <= 37.119 milliseconds (cumulative count 99610) 216s 99.810% <= 38.111 milliseconds (cumulative count 99810) 216s 99.930% <= 39.103 milliseconds (cumulative count 99930) 216s 99.980% <= 40.127 milliseconds (cumulative count 99980) 216s 100.000% <= 41.119 milliseconds (cumulative count 100000) 216s 216s Summary: 216s throughput summary: 14647.72 requests per second 216s latency summary (msec): 216s avg min p50 p95 p99 max 216s 17.846 0.592 18.079 29.727 34.495 40.351 223s ====== LRANGE_600 (first 600 elements) ====== 223s 100000 requests completed in 7.63 seconds 223s 50 parallel clients 223s 3 bytes payload 223s keep alive: 1 223s host configuration "save": 3600 1 300 100 60 10000 223s host configuration "appendonly": no 223s multi-thread: no 223s 223s Latency by percentile distribution: 223s 0.000% <= 0.623 milliseconds (cumulative count 10) 223s 50.000% <= 17.535 milliseconds (cumulative count 50040) 223s 75.000% <= 24.511 milliseconds (cumulative count 75050) 223s 87.500% <= 28.639 milliseconds (cumulative count 87520) 223s 93.750% <= 31.039 milliseconds (cumulative count 93780) 223s 96.875% <= 33.087 milliseconds (cumulative count 96880) 223s 98.438% <= 34.687 milliseconds (cumulative count 98450) 223s 99.219% <= 36.479 milliseconds (cumulative count 99240) 223s 99.609% <= 39.711 milliseconds (cumulative count 99610) 223s 99.805% <= 42.751 milliseconds (cumulative count 99810) 223s 99.902% <= 43.775 milliseconds (cumulative count 99910) 223s 99.951% <= 44.319 milliseconds (cumulative count 99960) 223s 99.976% <= 50.143 milliseconds (cumulative count 99980) 223s 99.988% <= 50.335 milliseconds (cumulative count 99990) 223s 99.994% <= 51.263 milliseconds (cumulative count 100000) 223s 100.000% <= 51.263 milliseconds (cumulative count 100000) 223s 223s Cumulative distribution of latencies: 223s 0.000% <= 0.103 milliseconds (cumulative count 0) 223s 0.010% <= 0.703 milliseconds (cumulative count 10) 223s 0.040% <= 1.103 milliseconds (cumulative count 40) 223s 0.100% <= 1.207 milliseconds (cumulative count 100) 223s 0.180% <= 1.303 milliseconds (cumulative count 180) 223s 0.280% <= 1.407 milliseconds (cumulative count 280) 223s 0.450% <= 1.503 milliseconds (cumulative count 450) 223s 0.580% <= 1.607 milliseconds (cumulative count 580) 223s 0.800% <= 1.703 milliseconds (cumulative count 800) 223s 0.900% <= 1.807 milliseconds (cumulative count 900) 223s 1.100% 
<= 1.903 milliseconds (cumulative count 1100) 223s 1.300% <= 2.007 milliseconds (cumulative count 1300) 223s 1.550% <= 2.103 milliseconds (cumulative count 1550) 223s 2.540% <= 3.103 milliseconds (cumulative count 2540) 223s 2.750% <= 4.103 milliseconds (cumulative count 2750) 223s 3.230% <= 5.103 milliseconds (cumulative count 3230) 223s 4.270% <= 6.103 milliseconds (cumulative count 4270) 223s 5.370% <= 7.103 milliseconds (cumulative count 5370) 223s 7.530% <= 8.103 milliseconds (cumulative count 7530) 223s 10.400% <= 9.103 milliseconds (cumulative count 10400) 223s 14.430% <= 10.103 milliseconds (cumulative count 14430) 223s 19.950% <= 11.103 milliseconds (cumulative count 19950) 223s 25.570% <= 12.103 milliseconds (cumulative count 25570) 223s 31.100% <= 13.103 milliseconds (cumulative count 31100) 223s 35.780% <= 14.103 milliseconds (cumulative count 35780) 223s 40.120% <= 15.103 milliseconds (cumulative count 40120) 223s 44.150% <= 16.103 milliseconds (cumulative count 44150) 223s 48.340% <= 17.103 milliseconds (cumulative count 48340) 223s 52.450% <= 18.111 milliseconds (cumulative count 52450) 223s 56.170% <= 19.103 milliseconds (cumulative count 56170) 223s 59.810% <= 20.111 milliseconds (cumulative count 59810) 223s 63.320% <= 21.103 milliseconds (cumulative count 63320) 223s 66.790% <= 22.111 milliseconds (cumulative count 66790) 223s 70.470% <= 23.103 milliseconds (cumulative count 70470) 223s 73.790% <= 24.111 milliseconds (cumulative count 73790) 223s 77.000% <= 25.103 milliseconds (cumulative count 77000) 223s 80.370% <= 26.111 milliseconds (cumulative count 80370) 223s 83.420% <= 27.103 milliseconds (cumulative count 83420) 223s 86.090% <= 28.111 milliseconds (cumulative count 86090) 223s 88.820% <= 29.103 milliseconds (cumulative count 88820) 223s 91.550% <= 30.111 milliseconds (cumulative count 91550) 223s 93.910% <= 31.103 milliseconds (cumulative count 93910) 223s 95.700% <= 32.111 milliseconds (cumulative count 95700) 223s 96.910% <= 33.119 
milliseconds (cumulative count 96910) 223s 98.010% <= 34.111 milliseconds (cumulative count 98010) 223s 98.690% <= 35.103 milliseconds (cumulative count 98690) 223s 99.130% <= 36.127 milliseconds (cumulative count 99130) 223s 99.440% <= 37.119 milliseconds (cumulative count 99440) 223s 99.540% <= 38.111 milliseconds (cumulative count 99540) 223s 99.600% <= 39.103 milliseconds (cumulative count 99600) 223s 99.620% <= 40.127 milliseconds (cumulative count 99620) 223s 99.700% <= 41.119 milliseconds (cumulative count 99700) 223s 99.750% <= 42.111 milliseconds (cumulative count 99750) 223s 99.840% <= 43.103 milliseconds (cumulative count 99840) 223s 99.940% <= 44.127 milliseconds (cumulative count 99940) 223s 99.960% <= 45.119 milliseconds (cumulative count 99960) 223s 99.970% <= 50.111 milliseconds (cumulative count 99970) 223s 99.990% <= 51.103 milliseconds (cumulative count 99990) 223s 100.000% <= 52.127 milliseconds (cumulative count 100000) 223s 223s Summary: 223s throughput summary: 13102.73 requests per second 223s latency summary (msec): 223s avg min p50 p95 p99 max 223s 18.340 0.616 17.535 31.679 35.647 51.263 224s ====== MSET (10 keys) ====== 224s 100000 requests completed in 0.31 seconds 224s 50 parallel clients 224s 3 bytes payload 224s keep alive: 1 224s host configuration "save": 3600 1 300 100 60 10000 224s host configuration "appendonly": no 224s multi-thread: no 224s 224s Latency by percentile distribution: 224s 0.000% <= 0.415 milliseconds (cumulative count 10) 224s 50.000% <= 1.383 milliseconds (cumulative count 50790) 224s 75.000% <= 1.599 milliseconds (cumulative count 75490) 224s 87.500% <= 1.727 milliseconds (cumulative count 87770) 224s 93.750% <= 1.815 milliseconds (cumulative count 94000) 224s 96.875% <= 1.903 milliseconds (cumulative count 97000) 224s 98.438% <= 1.999 milliseconds 
(cumulative count 98450) 224s 99.219% <= 2.111 milliseconds (cumulative count 99240) 224s 99.609% <= 2.207 milliseconds (cumulative count 99610) 224s 99.805% <= 2.287 milliseconds (cumulative count 99820) 224s 99.902% <= 2.367 milliseconds (cumulative count 99910) 224s 99.951% <= 2.431 milliseconds (cumulative count 99960) 224s 99.976% <= 2.495 milliseconds (cumulative count 99980) 224s 99.988% <= 2.503 milliseconds (cumulative count 99990) 224s 99.994% <= 2.543 milliseconds (cumulative count 100000) 224s 100.000% <= 2.543 milliseconds (cumulative count 100000) 224s 224s Cumulative distribution of latencies: 224s 0.000% <= 0.103 milliseconds (cumulative count 0) 224s 0.030% <= 0.503 milliseconds (cumulative count 30) 224s 0.090% <= 0.607 milliseconds (cumulative count 90) 224s 0.260% <= 0.703 milliseconds (cumulative count 260) 224s 0.860% <= 0.807 milliseconds (cumulative count 860) 224s 2.560% <= 0.903 milliseconds (cumulative count 2560) 224s 6.340% <= 1.007 milliseconds (cumulative count 6340) 224s 18.530% <= 1.103 milliseconds (cumulative count 18530) 224s 30.450% <= 1.207 milliseconds (cumulative count 30450) 224s 41.440% <= 1.303 milliseconds (cumulative count 41440) 224s 53.930% <= 1.407 milliseconds (cumulative count 53930) 224s 65.710% <= 1.503 milliseconds (cumulative count 65710) 224s 76.340% <= 1.607 milliseconds (cumulative count 76340) 224s 85.510% <= 1.703 milliseconds (cumulative count 85510) 224s 93.520% <= 1.807 milliseconds (cumulative count 93520) 224s 97.000% <= 1.903 milliseconds (cumulative count 97000) 224s 98.530% <= 2.007 milliseconds (cumulative count 98530) 224s 99.150% <= 2.103 milliseconds (cumulative count 99150) 224s 100.000% <= 3.103 milliseconds (cumulative count 100000) 224s 224s Summary: 224s throughput summary: 325732.88 requests per second 224s latency summary (msec): 224s avg min p50 p95 p99 max 224s 1.386 0.408 1.383 1.839 2.079 2.543 224s ====== XADD 
====== 224s 100000 requests completed in 0.22 seconds 224s 50 parallel clients 224s 3 bytes payload 224s keep alive: 1 224s host configuration "save": 3600 1 300 100 60 10000 224s host configuration "appendonly": no 224s multi-thread: no 224s 224s Latency by percentile distribution: 224s 0.000% <= 0.287 milliseconds (cumulative count 10) 224s 50.000% <= 0.839 milliseconds (cumulative count 50860) 224s 75.000% <= 1.103 milliseconds (cumulative count 75150) 224s 87.500% <= 1.479 milliseconds (cumulative count 87520) 224s 93.750% <= 1.759 milliseconds (cumulative count 93970) 224s 96.875% <= 1.879 milliseconds (cumulative count 96920) 224s 98.438% <= 1.967 milliseconds (cumulative count 98480) 224s 99.219% <= 2.063 milliseconds (cumulative count 99250) 224s 99.609% <= 2.271 milliseconds (cumulative count 99610) 224s 99.805% <= 2.631 milliseconds (cumulative count 99810) 224s 99.902% <= 2.711 milliseconds (cumulative count 99910) 224s 99.951% <= 2.751 milliseconds (cumulative count 99960) 224s 99.976% <= 2.815 milliseconds (cumulative count 99980) 224s 99.988% <= 2.895 milliseconds (cumulative count 99990) 224s 99.994% <= 3.303 milliseconds (cumulative count 100000) 224s 100.000% <= 3.303 milliseconds (cumulative count 100000) 224s 224s Cumulative distribution of latencies: 224s 0.000% <= 0.103 milliseconds (cumulative count 0) 224s 0.020% <= 0.303 milliseconds (cumulative count 20) 224s 0.490% <= 0.407 milliseconds (cumulative count 490) 224s 5.770% <= 0.503 milliseconds (cumulative count 5770) 224s 16.630% <= 0.607 milliseconds (cumulative count 16630) 224s 30.670% <= 0.703 milliseconds (cumulative count 30670) 224s 46.720% <= 0.807 milliseconds (cumulative count 46720) 224s 58.130% <= 0.903 milliseconds (cumulative count 58130) 224s 67.540% <= 1.007 milliseconds (cumulative count 67540) 224s 75.150% <= 1.103 milliseconds (cumulative count 75150) 224s 82.130% <= 1.207 milliseconds (cumulative count 82130) 224s 85.470% <= 1.303 milliseconds (cumulative count 85470) 
224s 86.730% <= 1.407 milliseconds (cumulative count 86730) 224s 87.880% <= 1.503 milliseconds (cumulative count 87880) 224s 89.940% <= 1.607 milliseconds (cumulative count 89940) 224s 92.410% <= 1.703 milliseconds (cumulative count 92410) 224s 95.220% <= 1.807 milliseconds (cumulative count 95220) 224s 97.430% <= 1.903 milliseconds (cumulative count 97430) 224s 99.000% <= 2.007 milliseconds (cumulative count 99000) 224s 99.380% <= 2.103 milliseconds (cumulative count 99380) 224s 99.990% <= 3.103 milliseconds (cumulative count 99990) 224s 100.000% <= 4.103 milliseconds (cumulative count 100000) 224s 224s Summary: 224s throughput summary: 452488.69 requests per second 224s latency summary (msec): 224s avg min p50 p95 p99 max 224s 0.941 0.280 0.839 1.799 2.007 3.303 224s 224s autopkgtest [02:15:32]: test 0002-benchmark: -----------------------] 225s autopkgtest [02:15:33]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 225s 0002-benchmark PASS 225s autopkgtest [02:15:33]: test 0003-valkey-check-aof: preparing testbed 225s Reading package lists... 226s Building dependency tree... 226s Reading state information... 226s Starting pkgProblemResolver with broken count: 0 226s Starting 2 pkgProblemResolver with broken count: 0 226s Done 226s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 227s autopkgtest [02:15:35]: test 0003-valkey-check-aof: [----------------------- 228s autopkgtest [02:15:36]: test 0003-valkey-check-aof: -----------------------] 228s autopkgtest [02:15:36]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - - 228s 0003-valkey-check-aof PASS 228s autopkgtest [02:15:36]: test 0004-valkey-check-rdb: preparing testbed 229s Reading package lists... 229s Building dependency tree... 229s Reading state information... 229s Starting pkgProblemResolver with broken count: 0 229s Starting 2 pkgProblemResolver with broken count: 0 229s Done 229s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
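Editor's note: the "throughput summary" figures in the benchmark output above are consistent with total requests divided by elapsed seconds. A quick sanity check (values copied from the log; elapsed times are printed with only two decimals, so allow roughly 2% slack, which matters on the sub-second MSET and XADD runs):

```python
# Sanity check: throughput summary ~= requests / elapsed seconds.
# (requests, elapsed_seconds, reported_rps) copied from the log above.
runs = {
    "LRANGE_300": (100_000, 3.29, 30404.38),
    "LRANGE_500": (100_000, 6.83, 14647.72),
    "LRANGE_600": (100_000, 7.63, 13102.73),
    "MSET":       (100_000, 0.31, 325732.88),
    "XADD":       (100_000, 0.22, 452488.69),
}
for name, (requests, secs, reported_rps) in runs.items():
    # Elapsed time is rounded to 2 decimals, so tolerate ~2% deviation.
    assert abs(requests / secs - reported_rps) / reported_rps < 0.02, name
```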
230s autopkgtest [02:15:38]: test 0004-valkey-check-rdb: [----------------------- 235s OK 236s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb 236s [offset 27] AUX FIELD valkey-ver = '7.2.8' 236s [offset 41] AUX FIELD redis-bits = '64' 236s [offset 53] AUX FIELD ctime = '1751595343' 236s [offset 68] AUX FIELD used-mem = '3020856' 236s [offset 80] AUX FIELD aof-base = '0' 236s [offset 82] Selecting DB ID 0 236s [offset 565567] Checksum OK 236s [offset 565567] \o/ RDB looks OK! \o/ 236s [info] 5 keys read 236s [info] 0 expires 236s [info] 0 already expired 236s autopkgtest [02:15:44]: test 0004-valkey-check-rdb: -----------------------] 236s autopkgtest [02:15:44]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - - 236s 0004-valkey-check-rdb PASS 237s autopkgtest [02:15:45]: test 0005-cjson: preparing testbed 237s Reading package lists... 237s Building dependency tree... 237s Reading state information... 237s Starting pkgProblemResolver with broken count: 0 237s Starting 2 pkgProblemResolver with broken count: 0 237s Done 238s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 238s autopkgtest [02:15:46]: test 0005-cjson: [----------------------- 244s 244s autopkgtest [02:15:52]: test 0005-cjson: -----------------------] 245s autopkgtest [02:15:53]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - - 245s 0005-cjson PASS 245s autopkgtest [02:15:53]: test 0006-migrate-from-redis: preparing testbed 283s Creating nova instance adt-noble-amd64-valkey-20250704-021147-juju-7f2275-prod-proposed-migration-environment-21-30308ce2-47c3-4d53-b097-f9239f946bc0 from image adt/ubuntu-noble-amd64-server-20250703.img (UUID 841a84e8-df42-4fef-9073-9d50a10876b1)... 
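Editor's note: the 0004-valkey-check-rdb output above walks the dump file by offset (AUX fields, DB selection, trailing checksum). As a rough illustration only, and assuming valkey 7.2 keeps the classic 9-byte "REDIS" + 4-digit-version magic at offset 0 (the hypothetical `looks_like_rdb` helper below is not part of valkey-check-rdb):

```python
def looks_like_rdb(header: bytes) -> bool:
    """Cheap pre-check of an RDB header: 'REDIS' magic + 4 ASCII digits.
    Assumption: valkey retains this Redis-compatible layout."""
    return len(header) >= 9 and header[:5] == b"REDIS" and header[5:9].isdigit()

looks_like_rdb(b"REDIS0011")  # True
looks_like_rdb(b"notanrdb!")  # False
```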
464s autopkgtest [02:19:32]: testbed dpkg architecture: amd64 465s autopkgtest [02:19:33]: testbed apt version: 2.8.3 465s autopkgtest [02:19:33]: @@@@@@@@@@@@@@@@@@@@ test bed setup 465s autopkgtest [02:19:33]: testbed release detected to be: noble 466s autopkgtest [02:19:34]: updating testbed package index (apt update) 466s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB] 467s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease 467s Hit:3 http://ftpmaster.internal/ubuntu noble-updates InRelease 467s Hit:4 http://ftpmaster.internal/ubuntu noble-security InRelease 467s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [65.3 kB] 467s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [3948 B] 467s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [63.8 kB] 467s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [28.9 kB] 467s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main i386 Packages [59.3 kB] 467s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 Packages [283 kB] 467s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 c-n-f Metadata [2248 B] 467s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted i386 Packages [9812 B] 467s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/restricted amd64 Packages [423 kB] 467s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/restricted amd64 c-n-f Metadata [116 B] 467s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/universe amd64 Packages [452 kB] 467s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/universe i386 Packages [348 kB] 467s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/universe amd64 c-n-f Metadata [7448 B] 467s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/multiverse i386 Packages [752 B] 467s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/multiverse amd64 Packages [5264 B] 467s Get:20 
http://ftpmaster.internal/ubuntu noble-proposed/multiverse amd64 c-n-f Metadata [116 B] 470s Fetched 2018 kB in 1s (2059 kB/s) 471s Reading package lists... 472s autopkgtest [02:19:40]: upgrading testbed (apt dist-upgrade and autopurge) 472s Reading package lists... 472s Building dependency tree... 472s Reading state information... 473s Calculating upgrade...Starting pkgProblemResolver with broken count: 0 473s Starting 2 pkgProblemResolver with broken count: 0 473s Done 473s Entering ResolveByKeep 474s 474s The following packages will be upgraded: 474s gzip libnetplan1 libnss-systemd libpam-systemd libsystemd-shared libsystemd0 474s libudev1 netplan-generator netplan.io openssh-client openssh-server 474s openssh-sftp-server python3-netplan systemd systemd-dev systemd-resolved 474s systemd-sysv systemd-timesyncd udev 474s 19 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 474s Need to get 10.7 MB of archives. 474s After this operation, 34.8 kB of additional disk space will be used. 
474s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main amd64 gzip amd64 1.12-1ubuntu3.1 [99.0 kB] 474s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libnss-systemd amd64 255.4-1ubuntu8.10 [159 kB] 474s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-dev all 255.4-1ubuntu8.10 [105 kB] 474s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-timesyncd amd64 255.4-1ubuntu8.10 [35.3 kB] 474s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-resolved amd64 255.4-1ubuntu8.10 [296 kB] 474s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libsystemd-shared amd64 255.4-1ubuntu8.10 [2074 kB] 475s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libsystemd0 amd64 255.4-1ubuntu8.10 [434 kB] 475s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd-sysv amd64 255.4-1ubuntu8.10 [11.9 kB] 475s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libpam-systemd amd64 255.4-1ubuntu8.10 [235 kB] 475s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 systemd amd64 255.4-1ubuntu8.10 [3475 kB] 475s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 udev amd64 255.4-1ubuntu8.10 [1873 kB] 475s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libudev1 amd64 255.4-1ubuntu8.10 [176 kB] 475s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-sftp-server amd64 1:9.6p1-3ubuntu13.13 [37.1 kB] 475s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-server amd64 1:9.6p1-3ubuntu13.13 [510 kB] 475s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 openssh-client amd64 1:9.6p1-3ubuntu13.13 [906 kB] 475s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 python3-netplan amd64 1.1.2-2~ubuntu24.04.2 [24.3 kB] 475s Get:17 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 netplan-generator amd64 
1.1.2-2~ubuntu24.04.2 [61.1 kB] 475s Get:18 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 netplan.io amd64 1.1.2-2~ubuntu24.04.2 [69.7 kB] 475s Get:19 http://ftpmaster.internal/ubuntu noble-proposed/main amd64 libnetplan1 amd64 1.1.2-2~ubuntu24.04.2 [132 kB] 476s Preconfiguring packages ... 476s Fetched 10.7 MB in 2s (6558 kB/s) 476s (Reading database ... 106307 files and directories currently installed.) 476s Preparing to unpack .../gzip_1.12-1ubuntu3.1_amd64.deb ... 476s Unpacking gzip (1.12-1ubuntu3.1) over (1.12-1ubuntu3) ... 476s Setting up gzip (1.12-1ubuntu3.1) ... 476s (Reading database ... 106307 files and directories currently installed.) 476s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.10_amd64.deb ... 476s Unpacking libnss-systemd:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ... 476s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.10_all.deb ... 
476s Unpacking systemd-dev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking systemd-timesyncd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking systemd-resolved (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking libsystemd-shared:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking libsystemd0:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Setting up libsystemd0:amd64 (255.4-1ubuntu8.10) ...
476s (Reading database ... 106307 files and directories currently installed.)
476s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking systemd-sysv (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking libpam-systemd:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
476s Preparing to unpack .../systemd_255.4-1ubuntu8.10_amd64.deb ...
476s Unpacking systemd (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
477s Preparing to unpack .../udev_255.4-1ubuntu8.10_amd64.deb ...
477s Unpacking udev (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
477s Preparing to unpack .../libudev1_255.4-1ubuntu8.10_amd64.deb ...
477s Unpacking libudev1:amd64 (255.4-1ubuntu8.10) over (255.4-1ubuntu8.8) ...
477s Setting up libudev1:amd64 (255.4-1ubuntu8.10) ...
477s (Reading database ... 106307 files and directories currently installed.)
477s Preparing to unpack .../0-openssh-sftp-server_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
477s Unpacking openssh-sftp-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
477s Preparing to unpack .../1-openssh-server_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
477s Unpacking openssh-server (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
477s Preparing to unpack .../2-openssh-client_1%3a9.6p1-3ubuntu13.13_amd64.deb ...
477s Unpacking openssh-client (1:9.6p1-3ubuntu13.13) over (1:9.6p1-3ubuntu13.12) ...
477s Preparing to unpack .../3-python3-netplan_1.1.2-2~ubuntu24.04.2_amd64.deb ...
477s Unpacking python3-netplan (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
477s Preparing to unpack .../4-netplan-generator_1.1.2-2~ubuntu24.04.2_amd64.deb ...
477s Adding 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
477s Unpacking netplan-generator (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
477s Preparing to unpack .../5-netplan.io_1.1.2-2~ubuntu24.04.2_amd64.deb ...
477s Unpacking netplan.io (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
477s Preparing to unpack .../6-libnetplan1_1.1.2-2~ubuntu24.04.2_amd64.deb ...
477s Unpacking libnetplan1:amd64 (1.1.2-2~ubuntu24.04.2) over (1.1.2-2~ubuntu24.04.1) ...
477s Setting up openssh-client (1:9.6p1-3ubuntu13.13) ...
477s Setting up systemd-dev (255.4-1ubuntu8.10) ...
477s Setting up libnetplan1:amd64 (1.1.2-2~ubuntu24.04.2) ...
477s Setting up libsystemd-shared:amd64 (255.4-1ubuntu8.10) ...
477s Setting up python3-netplan (1.1.2-2~ubuntu24.04.2) ...
477s Setting up openssh-sftp-server (1:9.6p1-3ubuntu13.13) ...
477s Setting up openssh-server (1:9.6p1-3ubuntu13.13) ...
478s Setting up systemd (255.4-1ubuntu8.10) ...
479s Setting up systemd-timesyncd (255.4-1ubuntu8.10) ...
479s Setting up udev (255.4-1ubuntu8.10) ...
480s Setting up netplan-generator (1.1.2-2~ubuntu24.04.2) ...
480s Removing 'diversion of /lib/systemd/system-generators/netplan to /lib/systemd/system-generators/netplan.usr-is-merged by netplan-generator'
480s Setting up systemd-resolved (255.4-1ubuntu8.10) ...
481s Setting up systemd-sysv (255.4-1ubuntu8.10) ...
481s Setting up libnss-systemd:amd64 (255.4-1ubuntu8.10) ...
481s Setting up netplan.io (1.1.2-2~ubuntu24.04.2) ...
481s Setting up libpam-systemd:amd64 (255.4-1ubuntu8.10) ...
481s Processing triggers for initramfs-tools (0.142ubuntu25.5) ...
481s update-initramfs: Generating /boot/initrd.img-6.8.0-63-generic
487s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
487s Processing triggers for ufw (0.36.2-6) ...
487s Processing triggers for man-db (2.12.0-4build2) ...
488s Processing triggers for dbus (1.14.10-4ubuntu4.1) ...
488s Processing triggers for install-info (7.1-3build2) ...
488s Reading package lists...
489s Building dependency tree...
489s Reading state information...
489s Starting pkgProblemResolver with broken count: 0
489s Starting 2 pkgProblemResolver with broken count: 0
489s Done
489s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
489s autopkgtest [02:19:57]: rebooting testbed after setup commands that affected boot
511s Reading package lists...
512s Building dependency tree...
512s Reading state information...
512s Starting pkgProblemResolver with broken count: 0
512s Starting 2 pkgProblemResolver with broken count: 0
512s Done
512s The following NEW packages will be installed:
512s   libatomic1 libjemalloc2 liblzf1 redis-sentinel redis-server redis-tools
512s 0 upgraded, 6 newly installed, 0 to remove and 0 not upgraded.
512s Need to get 1504 kB of archives.
512s After this operation, 7690 kB of additional disk space will be used.
512s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main amd64 libatomic1 amd64 14.2.0-4ubuntu2~24.04 [10.5 kB]
513s Get:2 http://ftpmaster.internal/ubuntu noble/universe amd64 libjemalloc2 amd64 5.3.0-2build1 [256 kB]
513s Get:3 http://ftpmaster.internal/ubuntu noble/universe amd64 liblzf1 amd64 3.6-4 [7624 B]
513s Get:4 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 redis-tools amd64 5:7.0.15-1ubuntu0.24.04.1 [1166 kB]
513s Get:5 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 redis-sentinel amd64 5:7.0.15-1ubuntu0.24.04.1 [12.3 kB]
513s Get:6 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 redis-server amd64 5:7.0.15-1ubuntu0.24.04.1 [51.7 kB]
513s Fetched 1504 kB in 1s (2239 kB/s)
513s Selecting previously unselected package libatomic1:amd64.
513s (Reading database ... 106307 files and directories currently installed.)
513s Preparing to unpack .../0-libatomic1_14.2.0-4ubuntu2~24.04_amd64.deb ...
513s Unpacking libatomic1:amd64 (14.2.0-4ubuntu2~24.04) ...
513s Selecting previously unselected package libjemalloc2:amd64.
513s Preparing to unpack .../1-libjemalloc2_5.3.0-2build1_amd64.deb ...
513s Unpacking libjemalloc2:amd64 (5.3.0-2build1) ...
513s Selecting previously unselected package liblzf1:amd64.
513s Preparing to unpack .../2-liblzf1_3.6-4_amd64.deb ...
513s Unpacking liblzf1:amd64 (3.6-4) ...
514s Selecting previously unselected package redis-tools.
514s Preparing to unpack .../3-redis-tools_5%3a7.0.15-1ubuntu0.24.04.1_amd64.deb ...
514s Unpacking redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
514s Selecting previously unselected package redis-sentinel.
514s Preparing to unpack .../4-redis-sentinel_5%3a7.0.15-1ubuntu0.24.04.1_amd64.deb ...
514s Unpacking redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
514s Selecting previously unselected package redis-server.
514s Preparing to unpack .../5-redis-server_5%3a7.0.15-1ubuntu0.24.04.1_amd64.deb ...
514s Unpacking redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
514s Setting up libjemalloc2:amd64 (5.3.0-2build1) ...
514s Setting up liblzf1:amd64 (3.6-4) ...
514s Setting up libatomic1:amd64 (14.2.0-4ubuntu2~24.04) ...
514s Setting up redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
514s Setting up redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
514s Created symlink /etc/systemd/system/redis.service → /usr/lib/systemd/system/redis-server.service.
514s Created symlink /etc/systemd/system/multi-user.target.wants/redis-server.service → /usr/lib/systemd/system/redis-server.service.
514s Setting up redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
515s Created symlink /etc/systemd/system/sentinel.service → /usr/lib/systemd/system/redis-sentinel.service.
515s Created symlink /etc/systemd/system/multi-user.target.wants/redis-sentinel.service → /usr/lib/systemd/system/redis-sentinel.service.
515s Processing triggers for man-db (2.12.0-4build2) ...
516s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
520s autopkgtest [02:20:28]: test 0006-migrate-from-redis: [-----------------------
520s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION
520s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf
520s + systemctl restart redis-server
520s + redis-cli -h 127.0.0.1 -p 6379 SET test 1
520s OK
520s + redis-cli -h 127.0.0.1 -p 6379 GET test
520s 1
520s + redis-cli -h 127.0.0.1 -p 6379 SAVE
520s OK
520s + sha256sum /var/lib/redis/dump.rdb
520s eddfefaa223abf9411e6ceb50a8ca37bc42a11a9b49e4942c93b29f68d98a610 /var/lib/redis/dump.rdb
520s + apt-get install -y valkey-redis-compat
520s Reading package lists...
521s Building dependency tree...
521s Reading state information...
521s The following additional packages will be installed:
521s   valkey-server valkey-tools
521s Suggested packages:
521s   ruby-redis
521s The following packages will be REMOVED:
521s   redis-sentinel redis-server redis-tools
521s The following NEW packages will be installed:
521s   valkey-redis-compat valkey-server valkey-tools
521s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded.
521s Need to get 1326 kB of archives.
521s After this operation, 348 kB of additional disk space will be used.
521s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 valkey-tools amd64 7.2.8+dfsg1-0ubuntu0.24.04.2 [1269 kB]
521s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 valkey-server amd64 7.2.8+dfsg1-0ubuntu0.24.04.2 [49.3 kB]
521s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe amd64 valkey-redis-compat all 7.2.8+dfsg1-0ubuntu0.24.04.2 [7748 B]
522s Fetched 1326 kB in 1s (2147 kB/s)
522s (Reading database ... 106366 files and directories currently installed.)
522s Removing redis-sentinel (5:7.0.15-1ubuntu0.24.04.1) ...
522s Removing redis-server (5:7.0.15-1ubuntu0.24.04.1) ...
523s Removing redis-tools (5:7.0.15-1ubuntu0.24.04.1) ...
523s Selecting previously unselected package valkey-tools.
523s (Reading database ... 106329 files and directories currently installed.)
523s Preparing to unpack .../valkey-tools_7.2.8+dfsg1-0ubuntu0.24.04.2_amd64.deb ...
523s Unpacking valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
523s Selecting previously unselected package valkey-server.
523s Preparing to unpack .../valkey-server_7.2.8+dfsg1-0ubuntu0.24.04.2_amd64.deb ...
523s Unpacking valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
523s Selecting previously unselected package valkey-redis-compat.
523s Preparing to unpack .../valkey-redis-compat_7.2.8+dfsg1-0ubuntu0.24.04.2_all.deb ...
523s Unpacking valkey-redis-compat (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
523s Setting up valkey-tools (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
523s Setting up valkey-server (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
523s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service.
523s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service.
524s Setting up valkey-redis-compat (7.2.8+dfsg1-0ubuntu0.24.04.2) ...
524s dpkg-query: no packages found matching valkey-sentinel
524s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf.
524s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf.
524s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey.
524s Processing triggers for man-db (2.12.0-4build2) ...
524s + '[' -f /etc/valkey/REDIS_MIGRATION ']'
524s + sha256sum /var/lib/valkey/dump.rdb
524s 68d303e7c782c16357a1f488de66f8e354a7cdff00d583c01975343f2731510f /var/lib/valkey/dump.rdb
524s + systemctl status valkey-server
524s + grep inactive
524s Active: inactive (dead) since Fri 2025-07-04 02:20:31 UTC; 512ms ago
524s + rm /etc/valkey/REDIS_MIGRATION
524s + systemctl start valkey-server
524s + systemctl status valkey-server
524s + grep running
524s Active: active (running) since Fri 2025-07-04 02:20:32 UTC; 7ms ago
524s + sha256sum /var/lib/valkey/dump.rdb
524s 68d303e7c782c16357a1f488de66f8e354a7cdff00d583c01975343f2731510f /var/lib/valkey/dump.rdb
524s + cat /etc/valkey/valkey.conf
524s + grep loglevel
524s + grep debug
524s loglevel debug
524s + valkey-cli -h 127.0.0.1 -p 6379 GET test
524s + grep 1
524s 1
525s autopkgtest [02:20:33]: test 0006-migrate-from-redis: -----------------------]
525s 0006-migrate-from-redis PASS
525s autopkgtest [02:20:33]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - -
526s autopkgtest [02:20:34]: @@@@@@@@@@@@@@@@@@@@ summary
526s 0001-valkey-cli PASS
526s 0002-benchmark PASS
526s 0003-valkey-check-aof PASS
526s 0004-valkey-check-rdb PASS
526s 0005-cjson PASS
526s 0006-migrate-from-redis PASS
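The core integrity check in test 0006 above is comparing sha256sum output for the dump file before and after an operation (here, the valkey-server start after removing the REDIS_MIGRATION flag) to confirm the data file was carried over unmodified. A minimal, self-contained sketch of that pattern, using temporary stand-in paths instead of the real /var/lib/redis and /var/lib/valkey directories and a fabricated file in place of a live RDB dump:

```shell
#!/bin/sh
# Sketch of the checksum-comparison pattern used by the migration test.
# Paths and file contents here are illustrative stand-ins only.
set -eu

src=$(mktemp -d)   # stands in for /var/lib/redis
dst=$(mktemp -d)   # stands in for /var/lib/valkey

# Stand-in for the dump.rdb produced by the SAVE command in the test.
printf 'fake-rdb-payload' > "$src/dump.rdb"

# Hash before the migration step moves the dump.
before=$(sha256sum "$src/dump.rdb" | cut -d' ' -f1)

# valkey-redis-compat reports moving on-disk dumps from
# /var/lib/redis/ to /var/lib/valkey; a plain move is emulated here.
mv "$src/dump.rdb" "$dst/dump.rdb"

# Hash after the move; a byte-identical file yields the same digest.
after=$(sha256sum "$dst/dump.rdb" | cut -d' ' -f1)

[ "$before" = "$after" ] && echo "dump checksum preserved"
```

The real test additionally gates the first valkey-server start on the /etc/valkey/REDIS_MIGRATION flag file and re-reads the key through valkey-cli; those steps need the installed services and are omitted here.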