0s autopkgtest [03:02:15]: starting date and time: 2025-02-28 03:02:15+0000
0s autopkgtest [03:02:15]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [03:02:15]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.5p1y8a0_/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:systemd,src:dpdk,src:samba --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 '--env=ADT_TEST_TRIGGERS=systemd/255.4-1ubuntu8.6 dpdk/23.11.2-0ubuntu0.24.04.1 samba/2:4.19.5+dfsg-4ubuntu9.1' -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-ppc64el --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos03-ppc64el-1.secgroup --name adt-noble-ppc64el-valkey-20250228-030214-juju-7f2275-prod-proposed-migration-environment-2-ef545cde-46d0-42d7-8d0f-255c573bf3db --image adt/ubuntu-noble-ppc64el-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration-ppc64el -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
99s autopkgtest [03:03:54]: testbed dpkg architecture: ppc64el
99s autopkgtest [03:03:54]: testbed apt version: 2.7.14build2
99s autopkgtest [03:03:54]: @@@@@@@@@@@@@@@@@@@@ test bed setup
100s autopkgtest [03:03:55]: testbed release detected to be: None
101s autopkgtest [03:03:56]: updating testbed package index (apt update)
101s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB]
101s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease
101s Hit:3 http://ftpmaster.internal/ubuntu noble-updates InRelease
101s Hit:4 http://ftpmaster.internal/ubuntu noble-security InRelease
102s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [66.2 kB]
102s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [18.6 kB]
102s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [61.6 kB]
102s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [9488 B]
102s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el Packages [88.2 kB]
102s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el c-n-f Metadata [3752 B]
102s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/restricted ppc64el Packages [1380 B]
102s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted ppc64el c-n-f Metadata [116 B]
102s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/universe ppc64el Packages [416 kB]
102s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/universe ppc64el c-n-f Metadata [9704 B]
102s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/multiverse ppc64el Packages [968 B]
102s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/multiverse ppc64el c-n-f Metadata [116 B]
107s Fetched 941 kB in 1s (940 kB/s)
108s Reading package lists...
109s Reading package lists...
109s Building dependency tree...
109s Reading state information...
110s Calculating upgrade...
110s The following packages will be upgraded:
110s cloud-init cryptsetup-bin libcryptsetup12
110s 3 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
110s Need to get 1206 kB of archives.
110s After this operation, 13.3 kB of additional disk space will be used.
110s Get:1 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libcryptsetup12 ppc64el 2:2.7.0-1ubuntu4.2 [375 kB]
110s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el cryptsetup-bin ppc64el 2:2.7.0-1ubuntu4.2 [227 kB]
110s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el cloud-init all 24.4.1-0ubuntu0~24.04.1 [604 kB]
111s Preconfiguring packages ...
111s Fetched 1206 kB in 1s (1426 kB/s)
111s (Reading database ... 72680 files and directories currently installed.)
111s Preparing to unpack .../libcryptsetup12_2%3a2.7.0-1ubuntu4.2_ppc64el.deb ...
111s Unpacking libcryptsetup12:ppc64el (2:2.7.0-1ubuntu4.2) over (2:2.7.0-1ubuntu4.1) ...
111s Preparing to unpack .../cryptsetup-bin_2%3a2.7.0-1ubuntu4.2_ppc64el.deb ...
111s Unpacking cryptsetup-bin (2:2.7.0-1ubuntu4.2) over (2:2.7.0-1ubuntu4.1) ...
111s Preparing to unpack .../cloud-init_24.4.1-0ubuntu0~24.04.1_all.deb ...
111s Unpacking cloud-init (24.4.1-0ubuntu0~24.04.1) over (24.4-0ubuntu1~24.04.2) ...
112s Setting up cloud-init (24.4.1-0ubuntu0~24.04.1) ...
114s Setting up libcryptsetup12:ppc64el (2:2.7.0-1ubuntu4.2) ...
114s Setting up cryptsetup-bin (2:2.7.0-1ubuntu4.2) ...
114s Processing triggers for rsyslog (8.2312.0-3ubuntu9) ...
114s Processing triggers for man-db (2.12.0-4build2) ...
115s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
115s Reading package lists...
115s Building dependency tree...
115s Reading state information...
116s 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
116s autopkgtest [03:04:11]: upgrading testbed (apt dist-upgrade and autopurge)
116s Reading package lists...
117s Building dependency tree...
117s Reading state information...
117s Calculating upgrade...Starting pkgProblemResolver with broken count: 0
117s Starting 2 pkgProblemResolver with broken count: 0
117s Done
118s Entering ResolveByKeep
119s
119s The following packages will be upgraded:
119s libnss-systemd libpam-systemd libsystemd-shared libsystemd0 libudev1 systemd
119s systemd-dev systemd-resolved systemd-sysv systemd-timesyncd udev
119s 11 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
119s Need to get 9885 kB of archives.
119s After this operation, 0 B of additional disk space will be used.
119s Get:1 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libnss-systemd ppc64el 255.4-1ubuntu8.6 [207 kB]
119s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-dev all 255.4-1ubuntu8.6 [104 kB]
119s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-timesyncd ppc64el 255.4-1ubuntu8.6 [37.6 kB]
119s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-resolved ppc64el 255.4-1ubuntu8.6 [345 kB]
120s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libsystemd-shared ppc64el 255.4-1ubuntu8.6 [2346 kB]
120s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libsystemd0 ppc64el 255.4-1ubuntu8.6 [526 kB]
120s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-sysv ppc64el 255.4-1ubuntu8.6 [11.9 kB]
120s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libpam-systemd ppc64el 255.4-1ubuntu8.6 [303 kB]
120s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd ppc64el 255.4-1ubuntu8.6 [3767 kB]
121s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el udev ppc64el 255.4-1ubuntu8.6 [2036 kB]
121s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libudev1 ppc64el 255.4-1ubuntu8.6 [200 kB]
122s Fetched 9885 kB in 2s (4344 kB/s)
122s (Reading database ... 72680 files and directories currently installed.)
122s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking libnss-systemd:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.6_all.deb ...
122s Unpacking systemd-dev (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking systemd-timesyncd (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking systemd-resolved (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking libsystemd-shared:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking libsystemd0:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Setting up libsystemd0:ppc64el (255.4-1ubuntu8.6) ...
122s (Reading database ... 72680 files and directories currently installed.)
122s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking systemd-sysv (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking libpam-systemd:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../systemd_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking systemd (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
122s Preparing to unpack .../udev_255.4-1ubuntu8.6_ppc64el.deb ...
122s Unpacking udev (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
123s Preparing to unpack .../libudev1_255.4-1ubuntu8.6_ppc64el.deb ...
123s Unpacking libudev1:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ...
123s Setting up libudev1:ppc64el (255.4-1ubuntu8.6) ...
123s Setting up systemd-dev (255.4-1ubuntu8.6) ...
123s Setting up libsystemd-shared:ppc64el (255.4-1ubuntu8.6) ...
123s Setting up systemd (255.4-1ubuntu8.6) ...
123s Setting up systemd-timesyncd (255.4-1ubuntu8.6) ...
124s Setting up udev (255.4-1ubuntu8.6) ...
125s Setting up systemd-resolved (255.4-1ubuntu8.6) ...
125s Setting up systemd-sysv (255.4-1ubuntu8.6) ...
125s Setting up libnss-systemd:ppc64el (255.4-1ubuntu8.6) ...
125s Setting up libpam-systemd:ppc64el (255.4-1ubuntu8.6) ...
126s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
126s Processing triggers for man-db (2.12.0-4build2) ...
126s Processing triggers for dbus (1.14.10-4ubuntu4.1) ...
126s Processing triggers for initramfs-tools (0.142ubuntu25.5) ...
127s update-initramfs: Generating /boot/initrd.img-6.8.0-54-generic
127s W: No lz4 in /usr/bin:/sbin:/bin, using gzip
135s Reading package lists...
135s Building dependency tree...
135s Reading state information...
135s Starting pkgProblemResolver with broken count: 0
136s Starting 2 pkgProblemResolver with broken count: 0
136s Done
136s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
136s autopkgtest [03:04:31]: rebooting testbed after setup commands that affected boot
140s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds...
172s autopkgtest [03:05:07]: testbed running kernel: Linux 6.8.0-54-generic #56-Ubuntu SMP Sat Feb 8 15:16:39 UTC 2025
175s autopkgtest [03:05:10]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
179s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.7+dfsg1-0ubuntu0.24.04.1 (dsc) [2515 B]
179s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.7+dfsg1-0ubuntu0.24.04.1 (tar) [2469 kB]
179s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe valkey 7.2.7+dfsg1-0ubuntu0.24.04.1 (diff) [18.0 kB]
180s gpgv: Signature made Fri Dec 13 23:31:20 2024 UTC
180s gpgv: using RSA key 63EEFC3DE14D5146CE7F24BF34B8AD7D9529E793
180s gpgv: issuer "lena.voytek@canonical.com"
180s gpgv: Can't check signature: No public key
180s dpkg-source: warning: cannot verify inline signature for ./valkey_7.2.7+dfsg1-0ubuntu0.24.04.1.dsc: no acceptable signature found
180s autopkgtest [03:05:15]: testing package valkey version 7.2.7+dfsg1-0ubuntu0.24.04.1
181s autopkgtest [03:05:16]: build not needed
186s autopkgtest [03:05:21]: test 0001-valkey-cli: preparing testbed
186s Reading package lists...
187s Building dependency tree...
187s Reading state information...
187s Starting pkgProblemResolver with broken count: 0
187s Starting 2 pkgProblemResolver with broken count: 0
187s Done
187s The following NEW packages will be installed:
187s libatomic1 libjemalloc2 liblzf1 valkey-server valkey-tools
187s 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
187s Need to get 1875 kB of archives.
187s After this operation, 10.4 MB of additional disk space will be used.
187s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main ppc64el libatomic1 ppc64el 14.2.0-4ubuntu2~24.04 [10.8 kB]
187s Get:2 http://ftpmaster.internal/ubuntu noble/universe ppc64el libjemalloc2 ppc64el 5.3.0-2build1 [259 kB]
188s Get:3 http://ftpmaster.internal/ubuntu noble/universe ppc64el liblzf1 ppc64el 3.6-4 [7920 B]
188s Get:4 http://ftpmaster.internal/ubuntu noble-updates/universe ppc64el valkey-tools ppc64el 7.2.7+dfsg1-0ubuntu0.24.04.1 [1548 kB]
188s Get:5 http://ftpmaster.internal/ubuntu noble-updates/universe ppc64el valkey-server ppc64el 7.2.7+dfsg1-0ubuntu0.24.04.1 [49.2 kB]
188s Fetched 1875 kB in 1s (2421 kB/s)
188s Selecting previously unselected package libatomic1:ppc64el.
188s (Reading database ... 72680 files and directories currently installed.)
188s Preparing to unpack .../libatomic1_14.2.0-4ubuntu2~24.04_ppc64el.deb ...
188s Unpacking libatomic1:ppc64el (14.2.0-4ubuntu2~24.04) ...
188s Selecting previously unselected package libjemalloc2:ppc64el.
188s Preparing to unpack .../libjemalloc2_5.3.0-2build1_ppc64el.deb ...
188s Unpacking libjemalloc2:ppc64el (5.3.0-2build1) ...
188s Selecting previously unselected package liblzf1:ppc64el.
188s Preparing to unpack .../liblzf1_3.6-4_ppc64el.deb ...
188s Unpacking liblzf1:ppc64el (3.6-4) ...
188s Selecting previously unselected package valkey-tools.
188s Preparing to unpack .../valkey-tools_7.2.7+dfsg1-0ubuntu0.24.04.1_ppc64el.deb ...
188s Unpacking valkey-tools (7.2.7+dfsg1-0ubuntu0.24.04.1) ...
189s Selecting previously unselected package valkey-server.
189s Preparing to unpack .../valkey-server_7.2.7+dfsg1-0ubuntu0.24.04.1_ppc64el.deb ...
189s Unpacking valkey-server (7.2.7+dfsg1-0ubuntu0.24.04.1) ...
189s Setting up libjemalloc2:ppc64el (5.3.0-2build1) ...
189s Setting up liblzf1:ppc64el (3.6-4) ...
189s Setting up libatomic1:ppc64el (14.2.0-4ubuntu2~24.04) ...
189s Setting up valkey-tools (7.2.7+dfsg1-0ubuntu0.24.04.1) ...
189s Setting up valkey-server (7.2.7+dfsg1-0ubuntu0.24.04.1) ...
189s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service.
189s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service.
190s Processing triggers for man-db (2.12.0-4build2) ...
190s Processing triggers for libc-bin (2.39-0ubuntu8.4) ...
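
The testbed preparation above installs valkey-server and enables it through the valkey-server.service symlinks, so when the 0001-valkey-cli test starts below the daemon is already listening on the default local port 6379 (see tcp_port and listener0 in the INFO dump that follows). As a rough sketch of that precondition, assuming only the Python standard library and the default 127.0.0.1:6379 bind (the host/port constants and the inline RESP command are illustrative, not part of the packaged test):

    import socket

    HOST, PORT = "127.0.0.1", 6379  # assumed valkey-server defaults on this testbed

    def ping(host: str = HOST, port: int = PORT, timeout: float = 3.0) -> bool:
        """Send an inline RESP PING and expect the simple-string reply +PONG."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(b"PING\r\n")      # inline command form of the RESP protocol
            reply = sock.recv(64)          # e.g. b"+PONG\r\n"
        return reply.startswith(b"+PONG")

    if __name__ == "__main__":
        print("valkey-server reachable:", ping())
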
192s autopkgtest [03:05:27]: test 0001-valkey-cli: [-----------------------
197s # Server
197s redis_version:7.2.4
197s server_name:valkey
197s valkey_version:7.2.7
197s redis_git_sha1:00000000
197s redis_git_dirty:0
197s redis_build_id:ce2b5437e981d4e7
197s redis_mode:standalone
197s os:Linux 6.8.0-54-generic ppc64le
197s arch_bits:64
197s monotonic_clock:POSIX clock_gettime
197s multiplexing_api:epoll
197s atomicvar_api:c11-builtin
197s gcc_version:13.3.0
197s process_id:1645
197s process_supervised:systemd
197s run_id:b33046a4e6e8f074e924de4bc59e2a74538a93b1
197s tcp_port:6379
197s server_time_usec:1740711932317095
197s uptime_in_seconds:5
197s uptime_in_days:0
197s hz:10
197s configured_hz:10
197s lru_clock:12658684
197s executable:/usr/bin/valkey-server
197s config_file:/etc/valkey/valkey.conf
197s io_threads_active:0
197s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
197s
197s # Clients
197s connected_clients:1
197s cluster_connections:0
197s maxclients:10000
197s client_recent_max_input_buffer:0
197s client_recent_max_output_buffer:0
197s blocked_clients:0
197s tracking_clients:0
197s clients_in_timeout_table:0
197s total_blocking_keys:0
197s total_blocking_keys_on_nokey:0
197s
197s # Memory
197s used_memory:971512
197s used_memory_human:948.74K
197s used_memory_rss:26542080
197s used_memory_rss_human:25.31M
197s used_memory_peak:971512
197s used_memory_peak_human:948.74K
197s used_memory_peak_perc:102.45%
197s used_memory_overhead:930544
197s used_memory_startup:930344
197s used_memory_dataset:40968
197s used_memory_dataset_perc:99.51%
197s allocator_allocated:4558688
197s allocator_active:9568256
197s allocator_resident:11927552
197s total_system_memory:4207935488
197s total_system_memory_human:3.92G
197s used_memory_lua:31744
197s used_memory_vm_eval:31744
197s used_memory_lua_human:31.00K
197s used_memory_scripts_eval:0
197s number_of_cached_scripts:0
197s number_of_functions:0
197s number_of_libraries:0
197s used_memory_vm_functions:32768
197s used_memory_vm_total:64512
197s used_memory_vm_total_human:63.00K
197s used_memory_functions:200
197s used_memory_scripts:200
197s used_memory_scripts_human:200B
197s maxmemory:0
197s maxmemory_human:0B
197s maxmemory_policy:noeviction
197s allocator_frag_ratio:2.10
197s allocator_frag_bytes:5009568
197s allocator_rss_ratio:1.25
197s allocator_rss_bytes:2359296
197s rss_overhead_ratio:2.23
197s rss_overhead_bytes:14614528
197s mem_fragmentation_ratio:28.52
197s mem_fragmentation_bytes:25611592
197s mem_not_counted_for_evict:0
197s mem_replication_backlog:0
197s mem_total_replication_buffers:0
197s mem_clients_slaves:0
197s mem_clients_normal:0
197s mem_cluster_links:0
197s mem_aof_buffer:0
197s mem_allocator:jemalloc-5.3.0
197s active_defrag_running:0
197s lazyfree_pending_objects:0
197s lazyfreed_objects:0
197s
197s # Persistence
197s loading:0
197s async_loading:0
197s current_cow_peak:0
197s current_cow_size:0
197s current_cow_size_age:0
197s current_fork_perc:0.00
197s current_save_keys_processed:0
197s current_save_keys_total:0
197s rdb_changes_since_last_save:0
197s rdb_bgsave_in_progress:0
197s rdb_last_save_time:1740711927
197s rdb_last_bgsave_status:ok
197s rdb_last_bgsave_time_sec:-1
197s rdb_current_bgsave_time_sec:-1
197s rdb_saves:0
197s rdb_last_cow_size:0
197s rdb_last_load_keys_expired:0
197s rdb_last_load_keys_loaded:0
197s aof_enabled:0
197s aof_rewrite_in_progress:0
197s aof_rewrite_scheduled:0
197s aof_last_rewrite_time_sec:-1
197s aof_current_rewrite_time_sec:-1
197s aof_last_bgrewrite_status:ok
197s aof_rewrites:0
197s aof_rewrites_consecutive_failures:0
197s aof_last_write_status:ok
197s aof_last_cow_size:0
197s module_fork_in_progress:0
197s module_fork_last_cow_size:0
197s
197s # Stats
197s total_connections_received:1
197s total_commands_processed:0
197s instantaneous_ops_per_sec:0
197s total_net_input_bytes:14
197s total_net_output_bytes:0
197s total_net_repl_input_bytes:0
197s total_net_repl_output_bytes:0
197s instantaneous_input_kbps:0.00
197s instantaneous_output_kbps:0.00
197s instantaneous_input_repl_kbps:0.00
197s instantaneous_output_repl_kbps:0.00
197s rejected_connections:0
197s sync_full:0
197s sync_partial_ok:0
197s sync_partial_err:0
197s expired_keys:0
197s expired_stale_perc:0.00
197s expired_time_cap_reached_count:0
197s expire_cycle_cpu_milliseconds:0
197s evicted_keys:0
197s evicted_clients:0
197s total_eviction_exceeded_time:0
197s current_eviction_exceeded_time:0
197s keyspace_hits:0
197s keyspace_misses:0
197s pubsub_channels:0
197s pubsub_patterns:0
197s pubsubshard_channels:0
197s latest_fork_usec:0
197s total_forks:0
197s migrate_cached_sockets:0
197s slave_expires_tracked_keys:0
197s active_defrag_hits:0
197s active_defrag_misses:0
197s active_defrag_key_hits:0
197s active_defrag_key_misses:0
197s total_active_defrag_time:0
197s current_active_defrag_time:0
197s tracking_total_keys:0
197s tracking_total_items:0
197s tracking_total_prefixes:0
197s unexpected_error_replies:0
197s total_error_replies:0
197s dump_payload_sanitizations:0
197s total_reads_processed:1
197s total_writes_processed:0
197s io_threaded_reads_processed:0
197s io_threaded_writes_processed:0
197s reply_buffer_shrinks:0
197s reply_buffer_expands:0
197s eventloop_cycles:51
197s eventloop_duration_sum:9657
197s eventloop_duration_cmd_sum:0
197s instantaneous_eventloop_cycles_per_sec:9
197s instantaneous_eventloop_duration_usec:187
197s acl_access_denied_auth:0
197s acl_access_denied_cmd:0
197s acl_access_denied_key:0
197s acl_access_denied_channel:0
197s
197s # Replication
197s role:master
197s connected_slaves:0
197s master_failover_state:no-failover
197s master_replid:39fcea3125985bb56305add21c27f4ae4580e0f6
197s master_replid2:0000000000000000000000000000000000000000
197s master_repl_offset:0
197s second_repl_offset:-1
197s repl_backlog_active:0
197s repl_backlog_size:1048576
197s repl_backlog_first_byte_offset:0
197s repl_backlog_histlen:0
197s
197s # CPU
197s used_cpu_sys:0.044060
197s used_cpu_user:0.059798
197s used_cpu_sys_children:0.000000
197s used_cpu_user_children:0.000660
197s used_cpu_sys_main_thread:0.045976
197s used_cpu_user_main_thread:0.057471
197s
197s # Modules
197s
197s # Errorstats
197s
197s # Cluster
197s cluster_enabled:0
197s
197s # Keyspace
197s Redis ver. 7.2.7
197s autopkgtest [03:05:32]: test 0001-valkey-cli: -----------------------]
198s 0001-valkey-cli PASS
198s autopkgtest [03:05:33]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
198s autopkgtest [03:05:33]: test 0002-benchmark: preparing testbed
199s Reading package lists...
199s Building dependency tree...
199s Reading state information...
199s Starting pkgProblemResolver with broken count: 0
199s Starting 2 pkgProblemResolver with broken count: 0
199s Done
199s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
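
The INFO payload dumped by the 0001-valkey-cli test above is a flat key:value listing grouped under "# Section" headers. A minimal parsing sketch for that layout, assuming the exact format shown above (the function name and the final usage line are illustrative):

    def parse_info(text: str) -> dict[str, dict[str, str]]:
        """Parse valkey INFO output into {section: {field: value}}."""
        sections: dict[str, dict[str, str]] = {}
        current = ""
        for raw in text.splitlines():
            line = raw.strip()
            if not line:
                continue                      # blank separator between sections
            if line.startswith("#"):          # e.g. "# Memory"
                current = line.lstrip("# ").strip()
                sections[current] = {}
            elif ":" in line:
                key, _, value = line.partition(":")
                sections.setdefault(current, {})[key] = value
        return sections

    # e.g. parse_info(info_text)["Server"]["valkey_version"] == "7.2.7"
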
200s autopkgtest [03:05:35]: test 0002-benchmark: [----------------------- 206s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=nan (overall: nan) ====== PING_INLINE ====== 206s 100000 requests completed in 0.16 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.287 milliseconds (cumulative count 20) 206s 50.000% <= 0.647 milliseconds (cumulative count 52010) 206s 75.000% <= 0.767 milliseconds (cumulative count 75610) 206s 87.500% <= 0.871 milliseconds (cumulative count 88260) 206s 93.750% <= 0.935 milliseconds (cumulative count 93760) 206s 96.875% <= 1.007 milliseconds (cumulative count 97000) 206s 98.438% <= 1.071 milliseconds (cumulative count 98500) 206s 99.219% <= 1.159 milliseconds (cumulative count 99220) 206s 99.609% <= 1.815 milliseconds (cumulative count 99610) 206s 99.805% <= 2.223 milliseconds (cumulative count 99810) 206s 99.902% <= 2.463 milliseconds (cumulative count 99910) 206s 99.951% <= 2.567 milliseconds (cumulative count 99960) 206s 99.976% <= 2.607 milliseconds (cumulative count 99980) 206s 99.988% <= 2.647 milliseconds (cumulative count 99990) 206s 99.994% <= 2.663 milliseconds (cumulative count 100000) 206s 100.000% <= 2.663 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.060% <= 0.303 milliseconds (cumulative count 60) 206s 4.150% <= 0.407 milliseconds (cumulative count 4150) 206s 14.810% <= 0.503 milliseconds (cumulative count 14810) 206s 40.180% <= 0.607 milliseconds (cumulative count 40180) 206s 64.750% <= 0.703 milliseconds (cumulative count 64750) 206s 80.810% <= 0.807 milliseconds (cumulative count 80810) 206s 91.600% <= 0.903 milliseconds (cumulative count 91600) 206s 97.000% <= 1.007 milliseconds (cumulative count 97000) 206s 98.880% <= 1.103 milliseconds (cumulative count 98880) 206s 99.330% <= 1.207 milliseconds (cumulative count 99330) 206s 99.460% <= 1.303 milliseconds (cumulative count 99460) 206s 99.490% <= 1.407 milliseconds (cumulative count 99490) 206s 99.520% <= 1.607 milliseconds (cumulative count 99520) 206s 99.570% <= 1.703 milliseconds (cumulative count 99570) 206s 99.600% <= 1.807 milliseconds (cumulative count 99600) 206s 99.660% <= 1.903 milliseconds (cumulative count 99660) 206s 99.710% <= 2.007 milliseconds (cumulative count 99710) 206s 99.760% <= 2.103 milliseconds (cumulative count 99760) 206s 100.000% <= 3.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 621118.00 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 0.669 0.280 0.647 0.967 1.127 2.663 206s PING_MBULK: rps=212760.0 (overall: 604431.8) avg_msec=0.476 (overall: 0.476) ====== PING_MBULK ====== 206s 100000 requests completed in 0.16 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.167 milliseconds (cumulative count 20) 206s 50.000% <= 0.391 milliseconds (cumulative count 51910) 206s 75.000% <= 0.447 milliseconds (cumulative count 75630) 206s 87.500% <= 0.575 milliseconds (cumulative count 87720) 206s 93.750% <= 0.663 milliseconds (cumulative count 94230) 206s 96.875% <= 0.743 milliseconds (cumulative count 
96880) 206s 98.438% <= 0.911 milliseconds (cumulative count 98480) 206s 99.219% <= 1.039 milliseconds (cumulative count 99250) 206s 99.609% <= 1.119 milliseconds (cumulative count 99640) 206s 99.805% <= 1.263 milliseconds (cumulative count 99810) 206s 99.902% <= 1.415 milliseconds (cumulative count 99910) 206s 99.951% <= 1.487 milliseconds (cumulative count 99960) 206s 99.976% <= 1.519 milliseconds (cumulative count 99980) 206s 99.988% <= 1.543 milliseconds (cumulative count 99990) 206s 99.994% <= 1.567 milliseconds (cumulative count 100000) 206s 100.000% <= 1.567 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.050% <= 0.207 milliseconds (cumulative count 50) 206s 0.250% <= 0.303 milliseconds (cumulative count 250) 206s 64.040% <= 0.407 milliseconds (cumulative count 64040) 206s 80.970% <= 0.503 milliseconds (cumulative count 80970) 206s 90.580% <= 0.607 milliseconds (cumulative count 90580) 206s 96.070% <= 0.703 milliseconds (cumulative count 96070) 206s 97.590% <= 0.807 milliseconds (cumulative count 97590) 206s 98.390% <= 0.903 milliseconds (cumulative count 98390) 206s 99.080% <= 1.007 milliseconds (cumulative count 99080) 206s 99.590% <= 1.103 milliseconds (cumulative count 99590) 206s 99.770% <= 1.207 milliseconds (cumulative count 99770) 206s 99.840% <= 1.303 milliseconds (cumulative count 99840) 206s 99.900% <= 1.407 milliseconds (cumulative count 99900) 206s 99.970% <= 1.503 milliseconds (cumulative count 99970) 206s 100.000% <= 1.607 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 636942.62 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 0.438 0.160 0.391 0.679 0.999 1.567 206s SET: rps=355000.0 (overall: 495810.0) avg_msec=0.746 (overall: 0.746) ====== SET ====== 206s 100000 requests completed in 0.20 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.303 milliseconds (cumulative count 10) 206s 50.000% <= 0.687 milliseconds (cumulative count 51600) 206s 75.000% <= 0.847 milliseconds (cumulative count 75500) 206s 87.500% <= 0.967 milliseconds (cumulative count 87590) 206s 93.750% <= 1.055 milliseconds (cumulative count 94110) 206s 96.875% <= 1.135 milliseconds (cumulative count 96920) 206s 98.438% <= 1.247 milliseconds (cumulative count 98440) 206s 99.219% <= 1.423 milliseconds (cumulative count 99220) 206s 99.609% <= 4.119 milliseconds (cumulative count 99610) 206s 99.805% <= 4.799 milliseconds (cumulative count 99810) 206s 99.902% <= 5.159 milliseconds (cumulative count 99910) 206s 99.951% <= 5.311 milliseconds (cumulative count 99960) 206s 99.976% <= 5.399 milliseconds (cumulative count 99980) 206s 99.988% <= 5.423 milliseconds (cumulative count 99990) 206s 99.994% <= 5.447 milliseconds (cumulative count 100000) 206s 100.000% <= 5.447 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.010% <= 0.303 milliseconds (cumulative count 10) 206s 1.730% <= 0.407 milliseconds (cumulative count 1730) 206s 5.780% <= 0.503 milliseconds (cumulative count 5780) 206s 27.740% <= 0.607 milliseconds (cumulative count 27740) 206s 54.610% <= 0.703 milliseconds (cumulative count 54610) 206s 70.310% <= 0.807 milliseconds (cumulative 
count 70310) 206s 81.740% <= 0.903 milliseconds (cumulative count 81740) 206s 90.670% <= 1.007 milliseconds (cumulative count 90670) 206s 95.910% <= 1.103 milliseconds (cumulative count 95910) 206s 98.090% <= 1.207 milliseconds (cumulative count 98090) 206s 98.810% <= 1.303 milliseconds (cumulative count 98810) 206s 99.180% <= 1.407 milliseconds (cumulative count 99180) 206s 99.390% <= 1.503 milliseconds (cumulative count 99390) 206s 99.470% <= 1.607 milliseconds (cumulative count 99470) 206s 99.500% <= 1.703 milliseconds (cumulative count 99500) 206s 99.600% <= 4.103 milliseconds (cumulative count 99600) 206s 99.890% <= 5.103 milliseconds (cumulative count 99890) 206s 100.000% <= 6.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 500000.00 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 0.748 0.296 0.687 1.079 1.351 5.447 206s ====== GET ====== 206s 100000 requests completed in 0.17 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.295 milliseconds (cumulative count 20) 206s 50.000% <= 0.567 milliseconds (cumulative count 51120) 206s 75.000% <= 0.711 milliseconds (cumulative count 75310) 206s 87.500% <= 0.839 milliseconds (cumulative count 87790) 206s 93.750% <= 0.935 milliseconds (cumulative count 93940) 206s 96.875% <= 1.023 milliseconds (cumulative count 96900) 206s 98.438% <= 1.103 milliseconds (cumulative count 98460) 206s 99.219% <= 1.191 milliseconds (cumulative count 99250) 206s 99.609% <= 1.295 milliseconds (cumulative count 99610) 206s 99.805% <= 1.375 milliseconds (cumulative count 99810) 206s 99.902% <= 1.423 milliseconds (cumulative count 99910) 206s 99.951% <= 1.519 milliseconds (cumulative count 99960) 206s 99.976% <= 1.543 milliseconds (cumulative count 99980) 206s 99.988% <= 1.559 milliseconds (cumulative count 99990) 206s 99.994% <= 1.567 milliseconds (cumulative count 100000) 206s 100.000% <= 1.567 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.030% <= 0.303 milliseconds (cumulative count 30) 206s 9.110% <= 0.407 milliseconds (cumulative count 9110) 206s 35.740% <= 0.503 milliseconds (cumulative count 35740) 206s 59.590% <= 0.607 milliseconds (cumulative count 59590) 206s 74.380% <= 0.703 milliseconds (cumulative count 74380) 206s 85.150% <= 0.807 milliseconds (cumulative count 85150) 206s 92.270% <= 0.903 milliseconds (cumulative count 92270) 206s 96.560% <= 1.007 milliseconds (cumulative count 96560) 206s 98.460% <= 1.103 milliseconds (cumulative count 98460) 206s 99.310% <= 1.207 milliseconds (cumulative count 99310) 206s 99.650% <= 1.303 milliseconds (cumulative count 99650) 206s 99.890% <= 1.407 milliseconds (cumulative count 99890) 206s 99.950% <= 1.503 milliseconds (cumulative count 99950) 206s 100.000% <= 1.607 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 581395.31 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 0.607 0.288 0.567 0.967 1.151 1.567 206s INCR: rps=110079.7 (overall: 511666.7) avg_msec=0.840 (overall: 0.840) ====== INCR ====== 206s 100000 requests completed in 0.19 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host 
configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.255 milliseconds (cumulative count 10) 206s 50.000% <= 0.791 milliseconds (cumulative count 51420) 206s 75.000% <= 0.943 milliseconds (cumulative count 75550) 206s 87.500% <= 1.063 milliseconds (cumulative count 88120) 206s 93.750% <= 1.143 milliseconds (cumulative count 93920) 206s 96.875% <= 1.239 milliseconds (cumulative count 96930) 206s 98.438% <= 1.351 milliseconds (cumulative count 98460) 206s 99.219% <= 1.511 milliseconds (cumulative count 99220) 206s 99.609% <= 4.807 milliseconds (cumulative count 99610) 206s 99.805% <= 4.991 milliseconds (cumulative count 99810) 206s 99.902% <= 5.143 milliseconds (cumulative count 99910) 206s 99.951% <= 5.255 milliseconds (cumulative count 99960) 206s 99.976% <= 5.303 milliseconds (cumulative count 99980) 206s 99.988% <= 5.327 milliseconds (cumulative count 99990) 206s 99.994% <= 5.343 milliseconds (cumulative count 100000) 206s 100.000% <= 5.343 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.080% <= 0.303 milliseconds (cumulative count 80) 206s 2.080% <= 0.407 milliseconds (cumulative count 2080) 206s 5.810% <= 0.503 milliseconds (cumulative count 5810) 206s 16.170% <= 0.607 milliseconds (cumulative count 16170) 206s 33.920% <= 0.703 milliseconds (cumulative count 33920) 206s 54.410% <= 0.807 milliseconds (cumulative count 54410) 206s 69.750% <= 0.903 milliseconds (cumulative count 69750) 206s 83.180% <= 1.007 milliseconds (cumulative count 83180) 206s 91.370% <= 1.103 milliseconds (cumulative count 91370) 206s 96.140% <= 1.207 milliseconds (cumulative count 96140) 206s 97.930% <= 1.303 milliseconds (cumulative count 97930) 206s 98.760% <= 1.407 milliseconds (cumulative count 98760) 206s 99.200% <= 1.503 milliseconds (cumulative count 99200) 206s 99.360% <= 1.607 milliseconds (cumulative count 99360) 206s 99.450% <= 1.703 milliseconds (cumulative count 99450) 206s 99.490% <= 1.807 milliseconds (cumulative count 99490) 206s 99.500% <= 1.903 milliseconds (cumulative count 99500) 206s 99.880% <= 5.103 milliseconds (cumulative count 99880) 206s 100.000% <= 6.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 518134.72 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 0.824 0.248 0.791 1.175 1.447 5.343 207s LPUSH: rps=178560.0 (overall: 409541.3) avg_msec=1.075 (overall: 1.075) ====== LPUSH ====== 207s 100000 requests completed in 0.24 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.287 milliseconds (cumulative count 10) 207s 50.000% <= 1.023 milliseconds (cumulative count 50860) 207s 75.000% <= 1.199 milliseconds (cumulative count 75490) 207s 87.500% <= 1.319 milliseconds (cumulative count 87710) 207s 93.750% <= 1.415 milliseconds (cumulative count 94090) 207s 96.875% <= 1.503 milliseconds (cumulative count 97020) 207s 98.438% <= 1.599 milliseconds (cumulative count 98500) 207s 99.219% <= 1.743 milliseconds (cumulative count 99220) 207s 99.609% <= 4.863 milliseconds (cumulative count 99610) 207s 99.805% <= 5.039 milliseconds (cumulative count 99810) 207s 99.902% <= 5.167 milliseconds (cumulative count 99910) 207s 99.951% <= 5.279 milliseconds (cumulative count 
99960) 207s 99.976% <= 5.319 milliseconds (cumulative count 99980) 207s 99.988% <= 5.351 milliseconds (cumulative count 99990) 207s 99.994% <= 5.375 milliseconds (cumulative count 100000) 207s 100.000% <= 5.375 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.010% <= 0.303 milliseconds (cumulative count 10) 207s 0.180% <= 0.407 milliseconds (cumulative count 180) 207s 1.260% <= 0.503 milliseconds (cumulative count 1260) 207s 4.420% <= 0.607 milliseconds (cumulative count 4420) 207s 10.030% <= 0.703 milliseconds (cumulative count 10030) 207s 20.330% <= 0.807 milliseconds (cumulative count 20330) 207s 32.180% <= 0.903 milliseconds (cumulative count 32180) 207s 48.300% <= 1.007 milliseconds (cumulative count 48300) 207s 62.780% <= 1.103 milliseconds (cumulative count 62780) 207s 76.480% <= 1.207 milliseconds (cumulative count 76480) 207s 86.390% <= 1.303 milliseconds (cumulative count 86390) 207s 93.630% <= 1.407 milliseconds (cumulative count 93630) 207s 97.020% <= 1.503 milliseconds (cumulative count 97020) 207s 98.580% <= 1.607 milliseconds (cumulative count 98580) 207s 99.110% <= 1.703 milliseconds (cumulative count 99110) 207s 99.310% <= 1.807 milliseconds (cumulative count 99310) 207s 99.410% <= 1.903 milliseconds (cumulative count 99410) 207s 99.490% <= 2.007 milliseconds (cumulative count 99490) 207s 99.500% <= 3.103 milliseconds (cumulative count 99500) 207s 99.870% <= 5.103 milliseconds (cumulative count 99870) 207s 100.000% <= 6.103 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 420168.06 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 1.042 0.280 1.023 1.439 1.671 5.375 207s RPUSH: rps=249761.0 (overall: 522416.7) avg_msec=0.824 (overall: 0.824) ====== RPUSH ====== 207s 100000 requests completed in 0.19 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.311 milliseconds (cumulative count 10) 207s 50.000% <= 0.807 milliseconds (cumulative count 50770) 207s 75.000% <= 0.951 milliseconds (cumulative count 75180) 207s 87.500% <= 1.047 milliseconds (cumulative count 87550) 207s 93.750% <= 1.119 milliseconds (cumulative count 94060) 207s 96.875% <= 1.183 milliseconds (cumulative count 96970) 207s 98.438% <= 1.239 milliseconds (cumulative count 98540) 207s 99.219% <= 1.287 milliseconds (cumulative count 99250) 207s 99.609% <= 1.327 milliseconds (cumulative count 99630) 207s 99.805% <= 1.383 milliseconds (cumulative count 99810) 207s 99.902% <= 1.431 milliseconds (cumulative count 99910) 207s 99.951% <= 1.487 milliseconds (cumulative count 99960) 207s 99.976% <= 1.495 milliseconds (cumulative count 99980) 207s 99.988% <= 1.535 milliseconds (cumulative count 99990) 207s 99.994% <= 1.567 milliseconds (cumulative count 100000) 207s 100.000% <= 1.567 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.640% <= 0.407 milliseconds (cumulative count 640) 207s 2.440% <= 0.503 milliseconds (cumulative count 2440) 207s 6.840% <= 0.607 milliseconds (cumulative count 6840) 207s 28.950% <= 0.703 milliseconds (cumulative count 28950) 207s 50.770% <= 0.807 milliseconds (cumulative count 50770) 207s 68.010% <= 0.903 milliseconds (cumulative 
count 68010) 207s 82.720% <= 1.007 milliseconds (cumulative count 82720) 207s 92.940% <= 1.103 milliseconds (cumulative count 92940) 207s 97.810% <= 1.207 milliseconds (cumulative count 97810) 207s 99.430% <= 1.303 milliseconds (cumulative count 99430) 207s 99.880% <= 1.407 milliseconds (cumulative count 99880) 207s 99.980% <= 1.503 milliseconds (cumulative count 99980) 207s 100.000% <= 1.607 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 526315.81 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 0.826 0.304 0.807 1.143 1.271 1.567 207s LPOP: rps=299960.0 (overall: 421292.1) avg_msec=1.061 (overall: 1.061) ====== LPOP ====== 207s 100000 requests completed in 0.24 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.391 milliseconds (cumulative count 10) 207s 50.000% <= 1.055 milliseconds (cumulative count 50930) 207s 75.000% <= 1.215 milliseconds (cumulative count 76020) 207s 87.500% <= 1.311 milliseconds (cumulative count 87810) 207s 93.750% <= 1.383 milliseconds (cumulative count 94030) 207s 96.875% <= 1.439 milliseconds (cumulative count 96940) 207s 98.438% <= 1.495 milliseconds (cumulative count 98460) 207s 99.219% <= 1.583 milliseconds (cumulative count 99240) 207s 99.609% <= 1.839 milliseconds (cumulative count 99610) 207s 99.805% <= 2.055 milliseconds (cumulative count 99810) 207s 99.902% <= 2.215 milliseconds (cumulative count 99910) 207s 99.951% <= 2.303 milliseconds (cumulative count 99960) 207s 99.976% <= 2.383 milliseconds (cumulative count 99980) 207s 99.988% <= 2.407 milliseconds (cumulative count 99990) 207s 99.994% <= 2.655 milliseconds (cumulative count 100000) 207s 100.000% <= 2.655 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.020% <= 0.407 milliseconds (cumulative count 20) 207s 0.190% <= 0.503 milliseconds (cumulative count 190) 207s 0.460% <= 0.607 milliseconds (cumulative count 460) 207s 1.320% <= 0.703 milliseconds (cumulative count 1320) 207s 11.160% <= 0.807 milliseconds (cumulative count 11160) 207s 25.530% <= 0.903 milliseconds (cumulative count 25530) 207s 42.680% <= 1.007 milliseconds (cumulative count 42680) 207s 58.960% <= 1.103 milliseconds (cumulative count 58960) 207s 74.890% <= 1.207 milliseconds (cumulative count 74890) 207s 87.030% <= 1.303 milliseconds (cumulative count 87030) 207s 95.530% <= 1.407 milliseconds (cumulative count 95530) 207s 98.580% <= 1.503 milliseconds (cumulative count 98580) 207s 99.300% <= 1.607 milliseconds (cumulative count 99300) 207s 99.450% <= 1.703 milliseconds (cumulative count 99450) 207s 99.550% <= 1.807 milliseconds (cumulative count 99550) 207s 99.690% <= 1.903 milliseconds (cumulative count 99690) 207s 99.770% <= 2.007 milliseconds (cumulative count 99770) 207s 99.850% <= 2.103 milliseconds (cumulative count 99850) 207s 100.000% <= 3.103 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 420168.06 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 1.063 0.384 1.055 1.407 1.535 2.655 207s RPOP: rps=352400.0 (overall: 466137.6) avg_msec=0.950 (overall: 0.950) ====== RPOP ====== 207s 100000 requests completed in 0.22 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 
207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.279 milliseconds (cumulative count 10) 207s 50.000% <= 0.935 milliseconds (cumulative count 50230) 207s 75.000% <= 1.095 milliseconds (cumulative count 75740) 207s 87.500% <= 1.191 milliseconds (cumulative count 88010) 207s 93.750% <= 1.247 milliseconds (cumulative count 93780) 207s 96.875% <= 1.311 milliseconds (cumulative count 96910) 207s 98.438% <= 1.375 milliseconds (cumulative count 98440) 207s 99.219% <= 1.439 milliseconds (cumulative count 99230) 207s 99.609% <= 1.503 milliseconds (cumulative count 99650) 207s 99.805% <= 1.559 milliseconds (cumulative count 99810) 207s 99.902% <= 1.631 milliseconds (cumulative count 99920) 207s 99.951% <= 1.663 milliseconds (cumulative count 99960) 207s 99.976% <= 1.679 milliseconds (cumulative count 99990) 207s 99.994% <= 1.695 milliseconds (cumulative count 100000) 207s 100.000% <= 1.695 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.040% <= 0.303 milliseconds (cumulative count 40) 207s 0.430% <= 0.407 milliseconds (cumulative count 430) 207s 0.930% <= 0.503 milliseconds (cumulative count 930) 207s 1.730% <= 0.607 milliseconds (cumulative count 1730) 207s 5.240% <= 0.703 milliseconds (cumulative count 5240) 207s 25.890% <= 0.807 milliseconds (cumulative count 25890) 207s 44.600% <= 0.903 milliseconds (cumulative count 44600) 207s 62.360% <= 1.007 milliseconds (cumulative count 62360) 207s 76.770% <= 1.103 milliseconds (cumulative count 76770) 207s 89.880% <= 1.207 milliseconds (cumulative count 89880) 207s 96.720% <= 1.303 milliseconds (cumulative count 96720) 207s 98.880% <= 1.407 milliseconds (cumulative count 98880) 207s 99.650% <= 1.503 milliseconds (cumulative count 99650) 207s 99.890% <= 1.607 milliseconds (cumulative count 99890) 207s 100.000% <= 1.703 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 465116.28 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 0.952 0.272 0.935 1.271 1.423 1.695 207s ====== SADD ====== 207s 100000 requests completed in 0.18 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.295 milliseconds (cumulative count 10) 207s 50.000% <= 0.751 milliseconds (cumulative count 50340) 207s 75.000% <= 0.903 milliseconds (cumulative count 75050) 207s 87.500% <= 1.015 milliseconds (cumulative count 87780) 207s 93.750% <= 1.135 milliseconds (cumulative count 93860) 207s 96.875% <= 1.287 milliseconds (cumulative count 96990) 207s 98.438% <= 1.455 milliseconds (cumulative count 98440) 207s 99.219% <= 1.599 milliseconds (cumulative count 99220) 207s 99.609% <= 1.871 milliseconds (cumulative count 99610) 207s 99.805% <= 2.039 milliseconds (cumulative count 99810) 207s 99.902% <= 2.151 milliseconds (cumulative count 99910) 207s 99.951% <= 2.207 milliseconds (cumulative count 99960) 207s 99.976% <= 2.239 milliseconds (cumulative count 99980) 207s 99.988% <= 2.271 milliseconds (cumulative count 99990) 207s 99.994% <= 2.567 milliseconds (cumulative count 100000) 207s 100.000% <= 2.567 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 
0.103 milliseconds (cumulative count 0) 207s 0.040% <= 0.303 milliseconds (cumulative count 40) 207s 1.220% <= 0.407 milliseconds (cumulative count 1220) 207s 3.660% <= 0.503 milliseconds (cumulative count 3660) 207s 14.720% <= 0.607 milliseconds (cumulative count 14720) 207s 41.580% <= 0.703 milliseconds (cumulative count 41580) 207s 60.550% <= 0.807 milliseconds (cumulative count 60550) 207s 75.050% <= 0.903 milliseconds (cumulative count 75050) 207s 87.190% <= 1.007 milliseconds (cumulative count 87190) 207s 92.750% <= 1.103 milliseconds (cumulative count 92750) 207s 95.670% <= 1.207 milliseconds (cumulative count 95670) 207s 97.170% <= 1.303 milliseconds (cumulative count 97170) 207s 98.130% <= 1.407 milliseconds (cumulative count 98130) 207s 98.710% <= 1.503 milliseconds (cumulative count 98710) 207s 99.250% <= 1.607 milliseconds (cumulative count 99250) 207s 99.440% <= 1.703 milliseconds (cumulative count 99440) 207s 99.530% <= 1.807 milliseconds (cumulative count 99530) 207s 99.660% <= 1.903 milliseconds (cumulative count 99660) 207s 99.780% <= 2.007 milliseconds (cumulative count 99780) 207s 99.880% <= 2.103 milliseconds (cumulative count 99880) 207s 100.000% <= 3.103 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 549450.56 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 0.792 0.288 0.751 1.175 1.543 2.567 208s HSET: rps=69123.5 (overall: 444871.8) avg_msec=0.979 (overall: 0.979) ====== HSET ====== 208s 100000 requests completed in 0.21 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.335 milliseconds (cumulative count 20) 208s 50.000% <= 0.927 milliseconds (cumulative count 50650) 208s 75.000% <= 1.079 milliseconds (cumulative count 75410) 208s 87.500% <= 1.183 milliseconds (cumulative count 88300) 208s 93.750% <= 1.239 milliseconds (cumulative count 93810) 208s 96.875% <= 1.303 milliseconds (cumulative count 97130) 208s 98.438% <= 1.351 milliseconds (cumulative count 98540) 208s 99.219% <= 1.399 milliseconds (cumulative count 99280) 208s 99.609% <= 1.455 milliseconds (cumulative count 99610) 208s 99.805% <= 1.511 milliseconds (cumulative count 99810) 208s 99.902% <= 1.567 milliseconds (cumulative count 99920) 208s 99.951% <= 1.631 milliseconds (cumulative count 99960) 208s 99.976% <= 1.663 milliseconds (cumulative count 99980) 208s 99.988% <= 1.679 milliseconds (cumulative count 99990) 208s 99.994% <= 1.687 milliseconds (cumulative count 100000) 208s 100.000% <= 1.687 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 0.103 milliseconds (cumulative count 0) 208s 0.200% <= 0.407 milliseconds (cumulative count 200) 208s 0.780% <= 0.503 milliseconds (cumulative count 780) 208s 1.700% <= 0.607 milliseconds (cumulative count 1700) 208s 5.320% <= 0.703 milliseconds (cumulative count 5320) 208s 27.120% <= 0.807 milliseconds (cumulative count 27120) 208s 46.300% <= 0.903 milliseconds (cumulative count 46300) 208s 64.110% <= 1.007 milliseconds (cumulative count 64110) 208s 78.450% <= 1.103 milliseconds (cumulative count 78450) 208s 91.140% <= 1.207 milliseconds (cumulative count 91140) 208s 97.130% <= 1.303 milliseconds (cumulative count 97130) 208s 99.360% <= 1.407 milliseconds (cumulative count 99360) 208s 99.790% <= 1.503 milliseconds (cumulative count 99790) 208s 
99.950% <= 1.607 milliseconds (cumulative count 99950) 208s 100.000% <= 1.703 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 467289.72 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 0.945 0.328 0.927 1.263 1.383 1.687 208s SPOP: rps=171040.0 (overall: 585753.4) avg_msec=0.712 (overall: 0.712) ====== SPOP ====== 208s 100000 requests completed in 0.16 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.271 milliseconds (cumulative count 10) 208s 50.000% <= 0.607 milliseconds (cumulative count 50460) 208s 75.000% <= 0.759 milliseconds (cumulative count 75170) 208s 87.500% <= 0.871 milliseconds (cumulative count 87760) 208s 93.750% <= 0.983 milliseconds (cumulative count 93870) 208s 96.875% <= 1.079 milliseconds (cumulative count 96920) 208s 98.438% <= 1.159 milliseconds (cumulative count 98490) 208s 99.219% <= 1.263 milliseconds (cumulative count 99270) 208s 99.609% <= 1.327 milliseconds (cumulative count 99670) 208s 99.805% <= 1.367 milliseconds (cumulative count 99810) 208s 99.902% <= 1.423 milliseconds (cumulative count 99910) 208s 99.951% <= 1.487 milliseconds (cumulative count 99960) 208s 99.976% <= 1.599 milliseconds (cumulative count 99980) 208s 99.988% <= 1.647 milliseconds (cumulative count 99990) 208s 99.994% <= 1.719 milliseconds (cumulative count 100000) 208s 100.000% <= 1.719 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 0.103 milliseconds (cumulative count 0) 208s 0.290% <= 0.303 milliseconds (cumulative count 290) 208s 11.010% <= 0.407 milliseconds (cumulative count 11010) 208s 28.410% <= 0.503 milliseconds (cumulative count 28410) 208s 50.460% <= 0.607 milliseconds (cumulative count 50460) 208s 66.560% <= 0.703 milliseconds (cumulative count 66560) 208s 81.100% <= 0.807 milliseconds (cumulative count 81100) 208s 89.960% <= 0.903 milliseconds (cumulative count 89960) 208s 94.810% <= 1.007 milliseconds (cumulative count 94810) 208s 97.570% <= 1.103 milliseconds (cumulative count 97570) 208s 98.830% <= 1.207 milliseconds (cumulative count 98830) 208s 99.510% <= 1.303 milliseconds (cumulative count 99510) 208s 99.890% <= 1.407 milliseconds (cumulative count 99890) 208s 99.960% <= 1.503 milliseconds (cumulative count 99960) 208s 99.980% <= 1.607 milliseconds (cumulative count 99980) 208s 99.990% <= 1.703 milliseconds (cumulative count 99990) 208s 100.000% <= 1.807 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 625000.00 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 0.638 0.264 0.607 1.015 1.231 1.719 208s ZADD: rps=278844.6 (overall: 432037.0) avg_msec=0.997 (overall: 0.997) ====== ZADD ====== 208s 100000 requests completed in 0.23 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.287 milliseconds (cumulative count 10) 208s 50.000% <= 0.943 milliseconds (cumulative count 50050) 208s 75.000% <= 1.119 milliseconds (cumulative count 75540) 208s 87.500% <= 1.231 milliseconds (cumulative count 87640) 208s 93.750% <= 1.335 milliseconds (cumulative count 93950) 208s 96.875% <= 1.463 
milliseconds (cumulative count 96930) 208s 98.438% <= 1.743 milliseconds (cumulative count 98440) 208s 99.219% <= 4.775 milliseconds (cumulative count 99220) 208s 99.609% <= 5.455 milliseconds (cumulative count 99610) 208s 99.805% <= 5.767 milliseconds (cumulative count 99810) 208s 99.902% <= 5.895 milliseconds (cumulative count 99910) 208s 99.951% <= 6.007 milliseconds (cumulative count 99960) 208s 99.976% <= 6.047 milliseconds (cumulative count 99980) 208s 99.988% <= 6.111 milliseconds (cumulative count 99990) 208s 99.994% <= 6.127 milliseconds (cumulative count 100000) 208s 100.000% <= 6.127 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 0.103 milliseconds (cumulative count 0) 208s 0.060% <= 0.303 milliseconds (cumulative count 60) 208s 1.250% <= 0.407 milliseconds (cumulative count 1250) 208s 2.650% <= 0.503 milliseconds (cumulative count 2650) 208s 5.180% <= 0.607 milliseconds (cumulative count 5180) 208s 9.640% <= 0.703 milliseconds (cumulative count 9640) 208s 26.920% <= 0.807 milliseconds (cumulative count 26920) 208s 43.760% <= 0.903 milliseconds (cumulative count 43760) 208s 59.780% <= 1.007 milliseconds (cumulative count 59780) 208s 73.580% <= 1.103 milliseconds (cumulative count 73580) 208s 85.320% <= 1.207 milliseconds (cumulative count 85320) 208s 92.400% <= 1.303 milliseconds (cumulative count 92400) 208s 96.120% <= 1.407 milliseconds (cumulative count 96120) 208s 97.440% <= 1.503 milliseconds (cumulative count 97440) 208s 97.990% <= 1.607 milliseconds (cumulative count 97990) 208s 98.310% <= 1.703 milliseconds (cumulative count 98310) 208s 98.610% <= 1.807 milliseconds (cumulative count 98610) 208s 98.810% <= 1.903 milliseconds (cumulative count 98810) 208s 98.950% <= 2.007 milliseconds (cumulative count 98950) 208s 99.000% <= 2.103 milliseconds (cumulative count 99000) 208s 99.470% <= 5.103 milliseconds (cumulative count 99470) 208s 99.980% <= 6.103 milliseconds (cumulative count 99980) 208s 100.000% <= 7.103 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 434782.59 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 1.001 0.280 0.943 1.367 2.095 6.127 208s ====== ZPOPMIN ====== 208s 100000 requests completed in 0.15 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.175 milliseconds (cumulative count 10) 208s 50.000% <= 0.575 milliseconds (cumulative count 51980) 208s 75.000% <= 0.687 milliseconds (cumulative count 75530) 208s 87.500% <= 0.791 milliseconds (cumulative count 87960) 208s 93.750% <= 0.863 milliseconds (cumulative count 94330) 208s 96.875% <= 0.935 milliseconds (cumulative count 97180) 208s 98.438% <= 0.991 milliseconds (cumulative count 98440) 208s 99.219% <= 1.103 milliseconds (cumulative count 99260) 208s 99.609% <= 1.167 milliseconds (cumulative count 99610) 208s 99.805% <= 1.231 milliseconds (cumulative count 99840) 208s 99.902% <= 1.295 milliseconds (cumulative count 99910) 208s 99.951% <= 1.383 milliseconds (cumulative count 99960) 208s 99.976% <= 1.439 milliseconds (cumulative count 99980) 208s 99.988% <= 1.455 milliseconds (cumulative count 99990) 208s 99.994% <= 1.511 milliseconds (cumulative count 100000) 208s 100.000% <= 1.511 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 
0.103 milliseconds (cumulative count 0) 208s 0.020% <= 0.207 milliseconds (cumulative count 20) 208s 0.440% <= 0.303 milliseconds (cumulative count 440) 208s 12.480% <= 0.407 milliseconds (cumulative count 12480) 208s 31.960% <= 0.503 milliseconds (cumulative count 31960) 208s 60.730% <= 0.607 milliseconds (cumulative count 60730) 208s 77.610% <= 0.703 milliseconds (cumulative count 77610) 208s 89.610% <= 0.807 milliseconds (cumulative count 89610) 208s 96.010% <= 0.903 milliseconds (cumulative count 96010) 208s 98.620% <= 1.007 milliseconds (cumulative count 98620) 208s 99.260% <= 1.103 milliseconds (cumulative count 99260) 208s 99.740% <= 1.207 milliseconds (cumulative count 99740) 208s 99.920% <= 1.303 milliseconds (cumulative count 99920) 208s 99.970% <= 1.407 milliseconds (cumulative count 99970) 208s 99.990% <= 1.503 milliseconds (cumulative count 99990) 208s 100.000% <= 1.607 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 653594.81 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 0.590 0.168 0.575 0.879 1.055 1.511 208s LPUSH (needed to benchmark LRANGE): rps=46200.0 (overall: 462000.0) avg_msec=0.909 (overall: 0.909) ====== LPUSH (needed to benchmark LRANGE) ====== 208s 100000 requests completed in 0.21 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.295 milliseconds (cumulative count 10) 208s 50.000% <= 0.903 milliseconds (cumulative count 50270) 208s 75.000% <= 1.071 milliseconds (cumulative count 75860) 208s 87.500% <= 1.175 milliseconds (cumulative count 88010) 208s 93.750% <= 1.255 milliseconds (cumulative count 94200) 208s 96.875% <= 1.327 milliseconds (cumulative count 96890) 208s 98.438% <= 1.407 milliseconds (cumulative count 98440) 208s 99.219% <= 1.511 milliseconds (cumulative count 99250) 208s 99.609% <= 1.591 milliseconds (cumulative count 99650) 208s 99.805% <= 1.631 milliseconds (cumulative count 99810) 208s 99.902% <= 1.703 milliseconds (cumulative count 99930) 208s 99.951% <= 1.759 milliseconds (cumulative count 99960) 208s 99.976% <= 1.807 milliseconds (cumulative count 99980) 208s 99.988% <= 1.823 milliseconds (cumulative count 99990) 208s 99.994% <= 1.887 milliseconds (cumulative count 100000) 208s 100.000% <= 1.887 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 0.103 milliseconds (cumulative count 0) 208s 0.030% <= 0.303 milliseconds (cumulative count 30) 208s 1.390% <= 0.407 milliseconds (cumulative count 1390) 208s 3.280% <= 0.503 milliseconds (cumulative count 3280) 208s 6.380% <= 0.607 milliseconds (cumulative count 6380) 208s 14.740% <= 0.703 milliseconds (cumulative count 14740) 208s 33.370% <= 0.807 milliseconds (cumulative count 33370) 208s 50.270% <= 0.903 milliseconds (cumulative count 50270) 208s 67.030% <= 1.007 milliseconds (cumulative count 67030) 208s 80.030% <= 1.103 milliseconds (cumulative count 80030) 208s 91.070% <= 1.207 milliseconds (cumulative count 91070) 208s 96.130% <= 1.303 milliseconds (cumulative count 96130) 208s 98.440% <= 1.407 milliseconds (cumulative count 98440) 208s 99.200% <= 1.503 milliseconds (cumulative count 99200) 208s 99.720% <= 1.607 milliseconds (cumulative count 99720) 208s 99.930% <= 1.703 milliseconds (cumulative count 99930) 208s 99.980% <= 1.807 milliseconds (cumulative count 99980) 
208s 100.000% <= 1.903 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 473933.66 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 0.915 0.288 0.903 1.279 1.471 1.887 209s LRANGE_100 (first 100 elements): rps=29055.1 (overall: 111818.2) avg_msec=3.117 (overall: 3.117) LRANGE_100 (first 100 elements): rps=112828.7 (overall: 112618.3) avg_msec=3.317 (overall: 3.275) LRANGE_100 (first 100 elements): rps=120360.0 (overall: 116031.8) avg_msec=2.994 (overall: 3.147) LRANGE_100 (first 100 elements): rps=114143.4 (overall: 115452.3) avg_msec=3.125 (overall: 3.140) ====== LRANGE_100 (first 100 elements) ====== 209s 100000 requests completed in 0.87 seconds 209s 50 parallel clients 209s 3 bytes payload 209s keep alive: 1 209s host configuration "save": 3600 1 300 100 60 10000 209s host configuration "appendonly": no 209s multi-thread: no 209s 209s Latency by percentile distribution: 209s 0.000% <= 0.479 milliseconds (cumulative count 10) 209s 50.000% <= 2.975 milliseconds (cumulative count 50070) 209s 75.000% <= 3.567 milliseconds (cumulative count 75230) 209s 87.500% <= 4.135 milliseconds (cumulative count 87500) 209s 93.750% <= 4.919 milliseconds (cumulative count 93760) 209s 96.875% <= 5.599 milliseconds (cumulative count 96890) 209s 98.438% <= 6.095 milliseconds (cumulative count 98460) 209s 99.219% <= 6.895 milliseconds (cumulative count 99220) 209s 99.609% <= 7.551 milliseconds (cumulative count 99610) 209s 99.805% <= 8.127 milliseconds (cumulative count 99810) 209s 99.902% <= 8.511 milliseconds (cumulative count 99920) 209s 99.951% <= 8.751 milliseconds (cumulative count 99960) 209s 99.976% <= 8.831 milliseconds (cumulative count 99980) 209s 99.988% <= 8.911 milliseconds (cumulative count 99990) 209s 99.994% <= 8.991 milliseconds (cumulative count 100000) 209s 100.000% <= 8.991 milliseconds (cumulative count 100000) 209s 209s Cumulative distribution of latencies: 209s 0.000% <= 0.103 milliseconds (cumulative count 0) 209s 0.010% <= 0.503 milliseconds (cumulative count 10) 209s 0.020% <= 1.303 milliseconds (cumulative count 20) 209s 0.080% <= 1.407 milliseconds (cumulative count 80) 209s 0.200% <= 1.503 milliseconds (cumulative count 200) 209s 0.420% <= 1.607 milliseconds (cumulative count 420) 209s 0.880% <= 1.703 milliseconds (cumulative count 880) 209s 1.870% <= 1.807 milliseconds (cumulative count 1870) 209s 3.170% <= 1.903 milliseconds (cumulative count 3170) 209s 5.220% <= 2.007 milliseconds (cumulative count 5220) 209s 8.000% <= 2.103 milliseconds (cumulative count 8000) 209s 56.230% <= 3.103 milliseconds (cumulative count 56230) 209s 87.200% <= 4.103 milliseconds (cumulative count 87200) 209s 94.670% <= 5.103 milliseconds (cumulative count 94670) 209s 98.460% <= 6.103 milliseconds (cumulative count 98460) 209s 99.340% <= 7.103 milliseconds (cumulative count 99340) 209s 99.800% <= 8.103 milliseconds (cumulative count 99800) 209s 100.000% <= 9.103 milliseconds (cumulative count 100000) 209s 209s Summary: 209s throughput summary: 115074.80 requests per second 209s latency summary (msec): 209s avg min p50 p95 p99 max 209s 3.159 0.472 2.975 5.175 6.559 8.991 213s LRANGE_300 (first 300 elements): rps=18015.8 (overall: 22790.0) avg_msec=12.393 (overall: 12.393) LRANGE_300 (first 300 elements): rps=27640.6 (overall: 25513.2) avg_msec=9.885 (overall: 10.868) LRANGE_300 (first 300 elements): rps=28945.3 (overall: 26747.2) avg_msec=8.228 (overall: 9.840) LRANGE_300 (first 300 elements): rps=29047.6 (overall: 27348.5) 
avg_msec=8.636 (overall: 9.506) LRANGE_300 (first 300 elements): rps=29864.5 (overall: 27868.3) avg_msec=8.011 (overall: 9.175) LRANGE_300 (first 300 elements): rps=29565.2 (overall: 28160.8) avg_msec=8.396 (overall: 9.034) LRANGE_300 (first 300 elements): rps=21235.1 (overall: 27149.5) avg_msec=13.692 (overall: 9.566) LRANGE_300 (first 300 elements): rps=24916.3 (overall: 26865.0) avg_msec=11.616 (overall: 9.808) LRANGE_300 (first 300 elements): rps=25310.1 (overall: 26684.9) avg_msec=11.228 (overall: 9.964) LRANGE_300 (first 300 elements): rps=27043.5 (overall: 26721.5) avg_msec=9.467 (overall: 9.913) LRANGE_300 (first 300 elements): rps=26793.7 (overall: 26728.1) avg_msec=9.303 (overall: 9.856) LRANGE_300 (first 300 elements): rps=25920.3 (overall: 26660.2) avg_msec=9.917 (overall: 9.861) LRANGE_300 (first 300 elements): rps=25341.3 (overall: 26557.5) avg_msec=10.272 (overall: 9.892) LRANGE_300 (first 300 elements): rps=26031.7 (overall: 26519.5) avg_msec=9.843 (overall: 9.888) LRANGE_300 (first 300 elements): rps=26706.3 (overall: 26532.1) avg_msec=8.879 (overall: 9.820) ====== LRANGE_300 (first 300 elements) ====== 213s 100000 requests completed in 3.77 seconds 213s 50 parallel clients 213s 3 bytes payload 213s keep alive: 1 213s host configuration "save": 3600 1 300 100 60 10000 213s host configuration "appendonly": no 213s multi-thread: no 213s 213s Latency by percentile distribution: 213s 0.000% <= 1.047 milliseconds (cumulative count 10) 213s 50.000% <= 9.063 milliseconds (cumulative count 50050) 213s 75.000% <= 11.807 milliseconds (cumulative count 75010) 213s 87.500% <= 14.623 milliseconds (cumulative count 87510) 213s 93.750% <= 16.911 milliseconds (cumulative count 93760) 213s 96.875% <= 18.927 milliseconds (cumulative count 96880) 213s 98.438% <= 21.071 milliseconds (cumulative count 98440) 213s 99.219% <= 22.767 milliseconds (cumulative count 99230) 213s 99.609% <= 23.471 milliseconds (cumulative count 99610) 213s 99.805% <= 24.015 milliseconds (cumulative count 99810) 213s 99.902% <= 24.447 milliseconds (cumulative count 99910) 213s 99.951% <= 25.023 milliseconds (cumulative count 99960) 213s 99.976% <= 25.247 milliseconds (cumulative count 99980) 213s 99.988% <= 25.343 milliseconds (cumulative count 99990) 213s 99.994% <= 25.567 milliseconds (cumulative count 100000) 213s 100.000% <= 25.567 milliseconds (cumulative count 100000) 213s 213s Cumulative distribution of latencies: 213s 0.000% <= 0.103 milliseconds (cumulative count 0) 213s 0.050% <= 1.103 milliseconds (cumulative count 50) 213s 0.120% <= 1.207 milliseconds (cumulative count 120) 213s 0.200% <= 1.303 milliseconds (cumulative count 200) 213s 0.270% <= 1.407 milliseconds (cumulative count 270) 213s 0.390% <= 1.503 milliseconds (cumulative count 390) 213s 0.550% <= 1.607 milliseconds (cumulative count 550) 213s 0.660% <= 1.703 milliseconds (cumulative count 660) 213s 0.870% <= 1.807 milliseconds (cumulative count 870) 213s 0.940% <= 1.903 milliseconds (cumulative count 940) 213s 1.140% <= 2.007 milliseconds (cumulative count 1140) 213s 1.250% <= 2.103 milliseconds (cumulative count 1250) 213s 2.120% <= 3.103 milliseconds (cumulative count 2120) 213s 3.280% <= 4.103 milliseconds (cumulative count 3280) 213s 7.140% <= 5.103 milliseconds (cumulative count 7140) 213s 13.430% <= 6.103 milliseconds (cumulative count 13430) 213s 23.650% <= 7.103 milliseconds (cumulative count 23650) 213s 37.190% <= 8.103 milliseconds (cumulative count 37190) 213s 50.520% <= 9.103 milliseconds (cumulative count 50520) 213s 62.060% <= 
10.103 milliseconds (cumulative count 62060) 213s 70.670% <= 11.103 milliseconds (cumulative count 70670) 213s 76.650% <= 12.103 milliseconds (cumulative count 76650) 213s 81.470% <= 13.103 milliseconds (cumulative count 81470) 213s 85.630% <= 14.103 milliseconds (cumulative count 85630) 213s 89.030% <= 15.103 milliseconds (cumulative count 89030) 213s 91.790% <= 16.103 milliseconds (cumulative count 91790) 213s 94.230% <= 17.103 milliseconds (cumulative count 94230) 213s 95.800% <= 18.111 milliseconds (cumulative count 95800) 213s 97.110% <= 19.103 milliseconds (cumulative count 97110) 213s 98.020% <= 20.111 milliseconds (cumulative count 98020) 213s 98.460% <= 21.103 milliseconds (cumulative count 98460) 213s 98.910% <= 22.111 milliseconds (cumulative count 98910) 213s 99.400% <= 23.103 milliseconds (cumulative count 99400) 213s 99.840% <= 24.111 milliseconds (cumulative count 99840) 213s 99.960% <= 25.103 milliseconds (cumulative count 99960) 213s 100.000% <= 26.111 milliseconds (cumulative count 100000) 213s 213s Summary: 213s throughput summary: 26525.20 requests per second 213s latency summary (msec): 213s avg min p50 p95 p99 max 213s 9.824 1.040 9.063 17.519 22.367 25.567 221s LRANGE_500 (first 500 elements): rps=10091.3 (overall: 11559.1) avg_msec=21.246 (overall: 21.246) LRANGE_500 (first 500 elements): rps=9868.5 (overall: 10658.2) avg_msec=19.606 (overall: 20.437) LRANGE_500 (first 500 elements): rps=14759.8 (overall: 12095.2) avg_msec=13.203 (overall: 17.344) LRANGE_500 (first 500 elements): rps=14325.5 (overall: 12675.5) avg_msec=14.439 (overall: 16.490) LRANGE_500 (first 500 elements): rps=15468.7 (overall: 13254.0) avg_msec=12.461 (overall: 15.516) LRANGE_500 (first 500 elements): rps=14215.1 (overall: 13416.3) avg_msec=15.054 (overall: 15.433) LRANGE_500 (first 500 elements): rps=12739.3 (overall: 13316.5) avg_msec=19.533 (overall: 16.011) LRANGE_500 (first 500 elements): rps=11901.2 (overall: 13137.2) avg_msec=19.276 (overall: 16.386) LRANGE_500 (first 500 elements): rps=13015.9 (overall: 13123.7) avg_msec=18.828 (overall: 16.656) LRANGE_500 (first 500 elements): rps=12027.8 (overall: 13013.2) avg_msec=18.694 (overall: 16.846) LRANGE_500 (first 500 elements): rps=12126.5 (overall: 12931.7) avg_msec=21.337 (overall: 17.233) LRANGE_500 (first 500 elements): rps=11231.1 (overall: 12789.6) avg_msec=22.266 (overall: 17.602) LRANGE_500 (first 500 elements): rps=11704.3 (overall: 12704.1) avg_msec=21.931 (overall: 17.917) LRANGE_500 (first 500 elements): rps=7793.8 (overall: 12345.4) avg_msec=41.185 (overall: 18.990) LRANGE_500 (first 500 elements): rps=11286.3 (overall: 12272.0) avg_msec=24.516 (overall: 19.342) LRANGE_500 (first 500 elements): rps=12865.4 (overall: 12310.1) avg_msec=18.214 (overall: 19.266) LRANGE_500 (first 500 elements): rps=12015.5 (overall: 12292.5) avg_msec=21.603 (overall: 19.403) LRANGE_500 (first 500 elements): rps=10175.8 (overall: 12173.5) avg_msec=22.732 (overall: 19.560) LRANGE_500 (first 500 elements): rps=9836.6 (overall: 12048.6) avg_msec=22.776 (overall: 19.700) LRANGE_500 (first 500 elements): rps=13231.1 (overall: 12107.3) avg_msec=19.382 (overall: 19.683) LRANGE_500 (first 500 elements): rps=12502.0 (overall: 12126.1) avg_msec=17.349 (overall: 19.568) LRANGE_500 (first 500 elements): rps=11629.8 (overall: 12102.7) avg_msec=22.066 (overall: 19.681) LRANGE_500 (first 500 elements): rps=11546.9 (overall: 12078.3) avg_msec=25.324 (overall: 19.918) LRANGE_500 (first 500 elements): rps=13374.5 (overall: 12131.8) avg_msec=21.363 (overall: 19.983) 
LRANGE_500 (first 500 elements): rps=11745.1 (overall: 12116.3) avg_msec=22.167 (overall: 20.069) LRANGE_500 (first 500 elements): rps=13300.4 (overall: 12161.7) avg_msec=17.688 (overall: 19.969) LRANGE_500 (first 500 elements): rps=13652.2 (overall: 12216.8) avg_msec=16.238 (overall: 19.815) LRANGE_500 (first 500 elements): rps=13275.6 (overall: 12254.7) avg_msec=17.936 (overall: 19.742) LRANGE_500 (first 500 elements): rps=13282.9 (overall: 12289.8) avg_msec=15.585 (overall: 19.588) LRANGE_500 (first 500 elements): rps=12011.9 (overall: 12280.5) avg_msec=19.614 (overall: 19.589) LRANGE_500 (first 500 elements): rps=11862.1 (overall: 12266.7) avg_msec=19.422 (overall: 19.584) LRANGE_500 (first 500 elements): rps=12836.7 (overall: 12284.3) avg_msec=16.442 (overall: 19.482) ====== LRANGE_500 (first 500 elements) ====== 221s 100000 requests completed in 8.14 seconds 221s 50 parallel clients 221s 3 bytes payload 221s keep alive: 1 221s host configuration "save": 3600 1 300 100 60 10000 221s host configuration "appendonly": no 221s multi-thread: no 221s 221s Latency by percentile distribution: 221s 0.000% <= 0.983 milliseconds (cumulative count 10) 221s 50.000% <= 18.431 milliseconds (cumulative count 50050) 221s 75.000% <= 24.751 milliseconds (cumulative count 75060) 221s 87.500% <= 29.919 milliseconds (cumulative count 87500) 221s 93.750% <= 33.311 milliseconds (cumulative count 93790) 221s 96.875% <= 37.183 milliseconds (cumulative count 96910) 221s 98.438% <= 40.767 milliseconds (cumulative count 98440) 221s 99.219% <= 47.103 milliseconds (cumulative count 99220) 221s 99.609% <= 74.623 milliseconds (cumulative count 99610) 221s 99.805% <= 79.295 milliseconds (cumulative count 99810) 221s 99.902% <= 93.247 milliseconds (cumulative count 99910) 221s 99.951% <= 94.783 milliseconds (cumulative count 99960) 221s 99.976% <= 95.551 milliseconds (cumulative count 99980) 221s 99.988% <= 95.807 milliseconds (cumulative count 99990) 221s 99.994% <= 96.063 milliseconds (cumulative count 100000) 221s 100.000% <= 96.063 milliseconds (cumulative count 100000) 221s 221s Cumulative distribution of latencies: 221s 0.000% <= 0.103 milliseconds (cumulative count 0) 221s 0.010% <= 1.007 milliseconds (cumulative count 10) 221s 0.030% <= 1.207 milliseconds (cumulative count 30) 221s 0.090% <= 1.407 milliseconds (cumulative count 90) 221s 0.120% <= 1.503 milliseconds (cumulative count 120) 221s 0.240% <= 1.607 milliseconds (cumulative count 240) 221s 0.270% <= 1.703 milliseconds (cumulative count 270) 221s 0.460% <= 1.807 milliseconds (cumulative count 460) 221s 0.570% <= 1.903 milliseconds (cumulative count 570) 221s 0.740% <= 2.007 milliseconds (cumulative count 740) 221s 0.910% <= 2.103 milliseconds (cumulative count 910) 221s 3.060% <= 3.103 milliseconds (cumulative count 3060) 221s 4.200% <= 4.103 milliseconds (cumulative count 4200) 221s 5.090% <= 5.103 milliseconds (cumulative count 5090) 221s 5.910% <= 6.103 milliseconds (cumulative count 5910) 221s 6.870% <= 7.103 milliseconds (cumulative count 6870) 221s 8.490% <= 8.103 milliseconds (cumulative count 8490) 221s 10.610% <= 9.103 milliseconds (cumulative count 10610) 221s 13.320% <= 10.103 milliseconds (cumulative count 13320) 221s 16.610% <= 11.103 milliseconds (cumulative count 16610) 221s 20.380% <= 12.103 milliseconds (cumulative count 20380) 221s 24.640% <= 13.103 milliseconds (cumulative count 24640) 221s 28.650% <= 14.103 milliseconds (cumulative count 28650) 221s 33.220% <= 15.103 milliseconds (cumulative count 33220) 221s 38.030% <= 16.103 
milliseconds (cumulative count 38030) 221s 43.360% <= 17.103 milliseconds (cumulative count 43360) 221s 48.610% <= 18.111 milliseconds (cumulative count 48610) 221s 53.190% <= 19.103 milliseconds (cumulative count 53190) 221s 58.060% <= 20.111 milliseconds (cumulative count 58060) 221s 62.380% <= 21.103 milliseconds (cumulative count 62380) 221s 66.380% <= 22.111 milliseconds (cumulative count 66380) 221s 70.060% <= 23.103 milliseconds (cumulative count 70060) 221s 73.290% <= 24.111 milliseconds (cumulative count 73290) 221s 75.960% <= 25.103 milliseconds (cumulative count 75960) 221s 78.330% <= 26.111 milliseconds (cumulative count 78330) 221s 80.700% <= 27.103 milliseconds (cumulative count 80700) 221s 83.180% <= 28.111 milliseconds (cumulative count 83180) 221s 85.490% <= 29.103 milliseconds (cumulative count 85490) 221s 87.930% <= 30.111 milliseconds (cumulative count 87930) 221s 90.260% <= 31.103 milliseconds (cumulative count 90260) 221s 92.100% <= 32.111 milliseconds (cumulative count 92100) 221s 93.550% <= 33.119 milliseconds (cumulative count 93550) 221s 94.520% <= 34.111 milliseconds (cumulative count 94520) 221s 95.240% <= 35.103 milliseconds (cumulative count 95240) 221s 96.090% <= 36.127 milliseconds (cumulative count 96090) 221s 96.870% <= 37.119 milliseconds (cumulative count 96870) 221s 97.530% <= 38.111 milliseconds (cumulative count 97530) 221s 97.980% <= 39.103 milliseconds (cumulative count 97980) 221s 98.300% <= 40.127 milliseconds (cumulative count 98300) 221s 98.510% <= 41.119 milliseconds (cumulative count 98510) 221s 98.750% <= 42.111 milliseconds (cumulative count 98750) 221s 98.920% <= 43.103 milliseconds (cumulative count 98920) 221s 99.050% <= 44.127 milliseconds (cumulative count 99050) 221s 99.150% <= 45.119 milliseconds (cumulative count 99150) 221s 99.220% <= 47.103 milliseconds (cumulative count 99220) 221s 99.320% <= 48.127 milliseconds (cumulative count 99320) 221s 99.360% <= 49.119 milliseconds (cumulative count 99360) 221s 99.370% <= 50.111 milliseconds (cumulative count 99370) 221s 99.380% <= 53.119 milliseconds (cumulative count 99380) 221s 99.390% <= 54.111 milliseconds (cumulative count 99390) 221s 99.400% <= 56.127 milliseconds (cumulative count 99400) 221s 99.430% <= 57.119 milliseconds (cumulative count 99430) 221s 99.450% <= 58.111 milliseconds (cumulative count 99450) 221s 99.460% <= 59.103 milliseconds (cumulative count 99460) 221s 99.470% <= 61.119 milliseconds (cumulative count 99470) 221s 99.500% <= 62.111 milliseconds (cumulative count 99500) 221s 99.510% <= 69.119 milliseconds (cumulative count 99510) 221s 99.540% <= 70.143 milliseconds (cumulative count 99540) 221s 99.570% <= 73.151 milliseconds (cumulative count 99570) 221s 99.590% <= 74.111 milliseconds (cumulative count 99590) 221s 99.620% <= 75.135 milliseconds (cumulative count 99620) 221s 99.670% <= 76.159 milliseconds (cumulative count 99670) 221s 99.710% <= 77.119 milliseconds (cumulative count 99710) 221s 99.760% <= 78.143 milliseconds (cumulative count 99760) 221s 99.800% <= 79.103 milliseconds (cumulative count 99800) 221s 99.820% <= 80.127 milliseconds (cumulative count 99820) 221s 99.830% <= 91.135 milliseconds (cumulative count 99830) 221s 99.870% <= 92.159 milliseconds (cumulative count 99870) 221s 99.900% <= 93.119 milliseconds (cumulative count 99900) 221s 99.940% <= 94.143 milliseconds (cumulative count 99940) 221s 99.970% <= 95.103 milliseconds (cumulative count 99970) 221s 100.000% <= 96.127 milliseconds (cumulative count 100000) 221s 221s Summary: 221s throughput 
summary: 12278.98 requests per second 221s latency summary (msec): 221s avg min p50 p95 p99 max 221s 19.471 0.976 18.431 34.751 43.743 96.063 230s LRANGE_600 (first 600 elements): rps=6901.6 (overall: 7932.1) avg_msec=25.300 (overall: 25.300) LRANGE_600 (first 600 elements): rps=9146.2 (overall: 8580.2) avg_msec=23.022 (overall: 24.004) LRANGE_600 (first 600 elements): rps=10245.1 (overall: 9159.6) avg_msec=19.530 (overall: 22.263) LRANGE_600 (first 600 elements): rps=10816.7 (overall: 9584.9) avg_msec=20.127 (overall: 21.644) LRANGE_600 (first 600 elements): rps=9884.5 (overall: 9646.1) avg_msec=24.673 (overall: 22.278) LRANGE_600 (first 600 elements): rps=9103.6 (overall: 9554.1) avg_msec=25.492 (overall: 22.797) LRANGE_600 (first 600 elements): rps=10778.7 (overall: 9732.8) avg_msec=19.368 (overall: 22.243) LRANGE_600 (first 600 elements): rps=10384.0 (overall: 9814.9) avg_msec=22.662 (overall: 22.299) LRANGE_600 (first 600 elements): rps=11223.1 (overall: 9973.1) avg_msec=20.813 (overall: 22.111) LRANGE_600 (first 600 elements): rps=8972.2 (overall: 9871.7) avg_msec=26.520 (overall: 22.517) LRANGE_600 (first 600 elements): rps=7980.6 (overall: 9693.9) avg_msec=26.110 (overall: 22.795) LRANGE_600 (first 600 elements): rps=10169.3 (overall: 9734.2) avg_msec=23.347 (overall: 22.844) LRANGE_600 (first 600 elements): rps=12133.3 (overall: 9922.2) avg_msec=19.913 (overall: 22.563) LRANGE_600 (first 600 elements): rps=9416.7 (overall: 9885.9) avg_msec=24.920 (overall: 22.724) LRANGE_600 (first 600 elements): rps=11078.7 (overall: 9966.5) avg_msec=21.800 (overall: 22.655) LRANGE_600 (first 600 elements): rps=12424.0 (overall: 10119.7) avg_msec=16.688 (overall: 22.198) LRANGE_600 (first 600 elements): rps=9584.3 (overall: 10087.7) avg_msec=25.381 (overall: 22.379) LRANGE_600 (first 600 elements): rps=10796.9 (overall: 10127.9) avg_msec=21.369 (overall: 22.318) LRANGE_600 (first 600 elements): rps=10540.5 (overall: 10150.2) avg_msec=22.423 (overall: 22.324) LRANGE_600 (first 600 elements): rps=11345.1 (overall: 10210.8) avg_msec=21.441 (overall: 22.274) LRANGE_600 (first 600 elements): rps=13521.7 (overall: 10369.2) avg_msec=14.868 (overall: 21.812) LRANGE_600 (first 600 elements): rps=12007.8 (overall: 10444.6) avg_msec=16.450 (overall: 21.528) LRANGE_600 (first 600 elements): rps=11785.7 (overall: 10502.9) avg_msec=19.015 (overall: 21.406) LRANGE_600 (first 600 elements): rps=10379.4 (overall: 10497.8) avg_msec=20.857 (overall: 21.383) LRANGE_600 (first 600 elements): rps=11284.6 (overall: 10529.4) avg_msec=21.122 (overall: 21.372) LRANGE_600 (first 600 elements): rps=13458.2 (overall: 10641.6) avg_msec=12.839 (overall: 20.958) LRANGE_600 (first 600 elements): rps=12835.9 (overall: 10724.1) avg_msec=15.642 (overall: 20.719) LRANGE_600 (first 600 elements): rps=13352.0 (overall: 10817.2) avg_msec=13.376 (overall: 20.398) LRANGE_600 (first 600 elements): rps=10565.2 (overall: 10808.5) avg_msec=20.333 (overall: 20.396) LRANGE_600 (first 600 elements): rps=12837.9 (overall: 10876.4) avg_msec=14.862 (overall: 20.177) LRANGE_600 (first 600 elements): rps=13960.6 (overall: 10976.6) avg_msec=13.411 (overall: 19.898) LRANGE_600 (first 600 elements): rps=12135.5 (overall: 11012.6) avg_msec=18.677 (overall: 19.856) LRANGE_600 (first 600 elements): rps=12824.2 (overall: 11068.4) avg_msec=18.109 (overall: 19.794) LRANGE_600 (first 600 elements): rps=12569.2 (overall: 11112.6) avg_msec=16.705 (overall: 19.690) LRANGE_600 (first 600 elements): rps=12565.7 (overall: 11153.9) avg_msec=18.652 (overall: 19.657) 
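Each per-command section in this benchmark run, including the LRANGE_600 report that follows, uses the same valkey-benchmark layout: inline rps/avg_msec progress ticks, a latency-by-percentile table, a cumulative latency distribution, and a final throughput/latency summary. As a point of reference only, a minimal sketch of a comparable standalone run is given here; it is not the exact invocation used by the 0002-benchmark test, and the host, port and -t selector are assumptions, while -c 50, -n 100000 and -d 3 mirror the "50 parallel clients / 100000 requests / 3 bytes payload" configuration printed in each report.

# Hypothetical manual run (assumed flags, not the packaged test itself); assumes
# valkey-tools is installed and a server listens on the default local port.
# "-t lrange" selects the LPUSH/LRANGE-style tests seen in this log.
valkey-benchmark -h 127.0.0.1 -p 6379 -c 50 -n 100000 -d 3 -k 1 -t lrange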
====== LRANGE_600 (first 600 elements) ====== 230s 100000 requests completed in 8.95 seconds 230s 50 parallel clients 230s 3 bytes payload 230s keep alive: 1 230s host configuration "save": 3600 1 300 100 60 10000 230s host configuration "appendonly": no 230s multi-thread: no 230s 230s Latency by percentile distribution: 230s 0.000% <= 0.887 milliseconds (cumulative count 10) 230s 50.000% <= 18.799 milliseconds (cumulative count 50000) 230s 75.000% <= 25.967 milliseconds (cumulative count 75050) 230s 87.500% <= 31.087 milliseconds (cumulative count 87500) 230s 93.750% <= 34.111 milliseconds (cumulative count 93830) 230s 96.875% <= 36.479 milliseconds (cumulative count 96890) 230s 98.438% <= 38.303 milliseconds (cumulative count 98440) 230s 99.219% <= 41.791 milliseconds (cumulative count 99220) 230s 99.609% <= 45.375 milliseconds (cumulative count 99610) 230s 99.805% <= 72.319 milliseconds (cumulative count 99810) 230s 99.902% <= 76.479 milliseconds (cumulative count 99910) 230s 99.951% <= 77.503 milliseconds (cumulative count 99960) 230s 99.976% <= 77.951 milliseconds (cumulative count 99980) 230s 99.988% <= 78.143 milliseconds (cumulative count 99990) 230s 99.994% <= 78.399 milliseconds (cumulative count 100000) 230s 100.000% <= 78.399 milliseconds (cumulative count 100000) 230s 230s Cumulative distribution of latencies: 230s 0.000% <= 0.103 milliseconds (cumulative count 0) 230s 0.010% <= 0.903 milliseconds (cumulative count 10) 230s 0.020% <= 1.007 milliseconds (cumulative count 20) 230s 0.030% <= 1.207 milliseconds (cumulative count 30) 230s 0.070% <= 1.407 milliseconds (cumulative count 70) 230s 0.140% <= 1.503 milliseconds (cumulative count 140) 230s 0.220% <= 1.607 milliseconds (cumulative count 220) 230s 0.350% <= 1.703 milliseconds (cumulative count 350) 230s 0.590% <= 1.807 milliseconds (cumulative count 590) 230s 0.850% <= 1.903 milliseconds (cumulative count 850) 230s 1.030% <= 2.007 milliseconds (cumulative count 1030) 230s 1.200% <= 2.103 milliseconds (cumulative count 1200) 230s 2.850% <= 3.103 milliseconds (cumulative count 2850) 230s 3.650% <= 4.103 milliseconds (cumulative count 3650) 230s 4.340% <= 5.103 milliseconds (cumulative count 4340) 230s 5.020% <= 6.103 milliseconds (cumulative count 5020) 230s 5.940% <= 7.103 milliseconds (cumulative count 5940) 230s 7.680% <= 8.103 milliseconds (cumulative count 7680) 230s 10.250% <= 9.103 milliseconds (cumulative count 10250) 230s 13.660% <= 10.103 milliseconds (cumulative count 13660) 230s 17.940% <= 11.103 milliseconds (cumulative count 17940) 230s 22.960% <= 12.103 milliseconds (cumulative count 22960) 230s 27.680% <= 13.103 milliseconds (cumulative count 27680) 230s 32.130% <= 14.103 milliseconds (cumulative count 32130) 230s 36.560% <= 15.103 milliseconds (cumulative count 36560) 230s 40.700% <= 16.103 milliseconds (cumulative count 40700) 230s 44.380% <= 17.103 milliseconds (cumulative count 44380) 230s 47.840% <= 18.111 milliseconds (cumulative count 47840) 230s 50.850% <= 19.103 milliseconds (cumulative count 50850) 230s 54.220% <= 20.111 milliseconds (cumulative count 54220) 230s 57.880% <= 21.103 milliseconds (cumulative count 57880) 230s 61.630% <= 22.111 milliseconds (cumulative count 61630) 230s 65.410% <= 23.103 milliseconds (cumulative count 65410) 230s 69.170% <= 24.111 milliseconds (cumulative count 69170) 230s 72.380% <= 25.103 milliseconds (cumulative count 72380) 230s 75.470% <= 26.111 milliseconds (cumulative count 75470) 230s 78.480% <= 27.103 milliseconds (cumulative count 78480) 230s 81.200% <= 28.111 
milliseconds (cumulative count 81200) 230s 83.520% <= 29.103 milliseconds (cumulative count 83520) 230s 85.480% <= 30.111 milliseconds (cumulative count 85480) 230s 87.540% <= 31.103 milliseconds (cumulative count 87540) 230s 90.000% <= 32.111 milliseconds (cumulative count 90000) 230s 92.090% <= 33.119 milliseconds (cumulative count 92090) 230s 93.830% <= 34.111 milliseconds (cumulative count 93830) 230s 95.340% <= 35.103 milliseconds (cumulative count 95340) 230s 96.550% <= 36.127 milliseconds (cumulative count 96550) 230s 97.440% <= 37.119 milliseconds (cumulative count 97440) 230s 98.290% <= 38.111 milliseconds (cumulative count 98290) 230s 98.790% <= 39.103 milliseconds (cumulative count 98790) 230s 98.960% <= 40.127 milliseconds (cumulative count 98960) 230s 99.080% <= 41.119 milliseconds (cumulative count 99080) 230s 99.270% <= 42.111 milliseconds (cumulative count 99270) 230s 99.370% <= 43.103 milliseconds (cumulative count 99370) 230s 99.500% <= 44.127 milliseconds (cumulative count 99500) 230s 99.580% <= 45.119 milliseconds (cumulative count 99580) 230s 99.670% <= 46.111 milliseconds (cumulative count 99670) 230s 99.710% <= 47.103 milliseconds (cumulative count 99710) 230s 99.730% <= 48.127 milliseconds (cumulative count 99730) 230s 99.740% <= 61.119 milliseconds (cumulative count 99740) 230s 99.750% <= 62.111 milliseconds (cumulative count 99750) 230s 99.760% <= 64.127 milliseconds (cumulative count 99760) 230s 99.770% <= 67.135 milliseconds (cumulative count 99770) 230s 99.780% <= 68.159 milliseconds (cumulative count 99780) 230s 99.790% <= 70.143 milliseconds (cumulative count 99790) 230s 99.800% <= 71.103 milliseconds (cumulative count 99800) 230s 99.810% <= 73.151 milliseconds (cumulative count 99810) 230s 99.820% <= 74.111 milliseconds (cumulative count 99820) 230s 99.850% <= 75.135 milliseconds (cumulative count 99850) 230s 99.890% <= 76.159 milliseconds (cumulative count 99890) 230s 99.940% <= 77.119 milliseconds (cumulative count 99940) 230s 99.990% <= 78.143 milliseconds (cumulative count 99990) 230s 100.000% <= 79.103 milliseconds (cumulative count 100000) 230s 230s Summary: 230s throughput summary: 11169.44 requests per second 230s latency summary (msec): 230s avg min p50 p95 p99 max 230s 19.624 0.880 18.799 34.911 40.415 78.399 231s MSET (10 keys): rps=85936.3 (overall: 173951.6) avg_msec=2.658 (overall: 2.658) MSET (10 keys): rps=187729.1 (overall: 183173.3) avg_msec=2.468 (overall: 2.528) ====== MSET (10 keys) ====== 231s 100000 requests completed in 0.54 seconds 231s 50 parallel clients 231s 3 bytes payload 231s keep alive: 1 231s host configuration "save": 3600 1 300 100 60 10000 231s host configuration "appendonly": no 231s multi-thread: no 231s 231s Latency by percentile distribution: 231s 0.000% <= 0.399 milliseconds (cumulative count 10) 231s 50.000% <= 2.535 milliseconds (cumulative count 50410) 231s 75.000% <= 2.767 milliseconds (cumulative count 75200) 231s 87.500% <= 2.911 milliseconds (cumulative count 87640) 231s 93.750% <= 3.079 milliseconds (cumulative count 93890) 231s 96.875% <= 4.751 milliseconds (cumulative count 96880) 231s 98.438% <= 5.559 milliseconds (cumulative count 98440) 231s 99.219% <= 6.127 milliseconds (cumulative count 99250) 231s 99.609% <= 6.351 milliseconds (cumulative count 99640) 231s 99.805% <= 6.495 milliseconds (cumulative count 99820) 231s 99.902% <= 6.655 milliseconds (cumulative count 99910) 231s 99.951% <= 6.807 milliseconds (cumulative count 99960) 231s 99.976% <= 6.895 milliseconds (cumulative count 99980) 231s 99.988% <= 
6.927 milliseconds (cumulative count 99990) 231s 99.994% <= 6.959 milliseconds (cumulative count 100000) 231s 100.000% <= 6.959 milliseconds (cumulative count 100000) 231s 231s Cumulative distribution of latencies: 231s 0.000% <= 0.103 milliseconds (cumulative count 0) 231s 0.020% <= 0.407 milliseconds (cumulative count 20) 231s 0.170% <= 0.503 milliseconds (cumulative count 170) 231s 0.210% <= 0.607 milliseconds (cumulative count 210) 231s 0.350% <= 0.703 milliseconds (cumulative count 350) 231s 0.490% <= 0.807 milliseconds (cumulative count 490) 231s 0.610% <= 0.903 milliseconds (cumulative count 610) 231s 0.810% <= 1.007 milliseconds (cumulative count 810) 231s 1.000% <= 1.103 milliseconds (cumulative count 1000) 231s 1.160% <= 1.207 milliseconds (cumulative count 1160) 231s 1.390% <= 1.303 milliseconds (cumulative count 1390) 231s 1.960% <= 1.407 milliseconds (cumulative count 1960) 231s 3.390% <= 1.503 milliseconds (cumulative count 3390) 231s 7.640% <= 1.607 milliseconds (cumulative count 7640) 231s 12.180% <= 1.703 milliseconds (cumulative count 12180) 231s 14.560% <= 1.807 milliseconds (cumulative count 14560) 231s 15.460% <= 1.903 milliseconds (cumulative count 15460) 231s 16.810% <= 2.007 milliseconds (cumulative count 16810) 231s 19.680% <= 2.103 milliseconds (cumulative count 19680) 231s 94.250% <= 3.103 milliseconds (cumulative count 94250) 231s 96.250% <= 4.103 milliseconds (cumulative count 96250) 231s 97.970% <= 5.103 milliseconds (cumulative count 97970) 231s 99.190% <= 6.103 milliseconds (cumulative count 99190) 231s 100.000% <= 7.103 milliseconds (cumulative count 100000) 231s 231s Summary: 231s throughput summary: 184162.06 requests per second 231s latency summary (msec): 231s avg min p50 p95 p99 max 231s 2.532 0.392 2.535 3.207 6.031 6.959 231s XADD: rps=91502.0 (overall: 282317.1) avg_msec=1.618 (overall: 1.618) XADD: rps=293800.0 (overall: 290963.9) avg_msec=1.557 (overall: 1.572) ====== XADD ====== 231s 100000 requests completed in 0.34 seconds 231s 50 parallel clients 231s 3 bytes payload 231s keep alive: 1 231s host configuration "save": 3600 1 300 100 60 10000 231s host configuration "appendonly": no 231s multi-thread: no 231s 231s Latency by percentile distribution: 231s 0.000% <= 0.431 milliseconds (cumulative count 10) 231s 50.000% <= 1.575 milliseconds (cumulative count 50370) 231s 75.000% <= 1.767 milliseconds (cumulative count 75360) 231s 87.500% <= 1.887 milliseconds (cumulative count 87520) 231s 93.750% <= 1.983 milliseconds (cumulative count 93870) 231s 96.875% <= 2.071 milliseconds (cumulative count 96940) 231s 98.438% <= 2.151 milliseconds (cumulative count 98490) 231s 99.219% <= 2.263 milliseconds (cumulative count 99270) 231s 99.609% <= 4.815 milliseconds (cumulative count 99610) 231s 99.805% <= 5.359 milliseconds (cumulative count 99810) 231s 99.902% <= 5.519 milliseconds (cumulative count 99910) 231s 99.951% <= 5.623 milliseconds (cumulative count 99960) 231s 99.976% <= 5.815 milliseconds (cumulative count 99980) 231s 99.988% <= 5.839 milliseconds (cumulative count 99990) 231s 99.994% <= 5.911 milliseconds (cumulative count 100000) 231s 100.000% <= 5.911 milliseconds (cumulative count 100000) 231s 231s Cumulative distribution of latencies: 231s 0.000% <= 0.103 milliseconds (cumulative count 0) 231s 0.060% <= 0.503 milliseconds (cumulative count 60) 231s 0.200% <= 0.607 milliseconds (cumulative count 200) 231s 0.260% <= 0.703 milliseconds (cumulative count 260) 231s 0.320% <= 0.807 milliseconds (cumulative count 320) 231s 0.800% <= 0.903 
milliseconds (cumulative count 800) 231s 2.420% <= 1.007 milliseconds (cumulative count 2420) 231s 6.490% <= 1.103 milliseconds (cumulative count 6490) 231s 15.380% <= 1.207 milliseconds (cumulative count 15380) 231s 22.100% <= 1.303 milliseconds (cumulative count 22100) 231s 31.560% <= 1.407 milliseconds (cumulative count 31560) 231s 41.730% <= 1.503 milliseconds (cumulative count 41730) 231s 54.400% <= 1.607 milliseconds (cumulative count 54400) 231s 67.030% <= 1.703 milliseconds (cumulative count 67030) 231s 79.980% <= 1.807 milliseconds (cumulative count 79980) 231s 88.770% <= 1.903 milliseconds (cumulative count 88770) 231s 94.870% <= 2.007 milliseconds (cumulative count 94870) 231s 97.670% <= 2.103 milliseconds (cumulative count 97670) 231s 99.500% <= 3.103 milliseconds (cumulative count 99500) 231s 99.790% <= 5.103 milliseconds (cumulative count 99790) 231s 100.000% <= 6.103 milliseconds (cumulative count 100000) 231s 231s Summary: 231s throughput summary: 291545.19 requests per second 231s latency summary (msec): 231s avg min p50 p95 p99 max 231s 1.567 0.424 1.575 2.015 2.223 5.911 231s 231s autopkgtest [03:06:06]: test 0002-benchmark: -----------------------] 232s 0002-benchmark PASS 232s autopkgtest [03:06:07]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 232s autopkgtest [03:06:07]: test 0003-valkey-check-aof: preparing testbed 233s Reading package lists... 233s Building dependency tree... 233s Reading state information... 233s Starting pkgProblemResolver with broken count: 0 233s Starting 2 pkgProblemResolver with broken count: 0 233s Done 233s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 234s autopkgtest [03:06:09]: test 0003-valkey-check-aof: [----------------------- 235s autopkgtest [03:06:10]: test 0003-valkey-check-aof: -----------------------] 235s 0003-valkey-check-aof PASS 235s autopkgtest [03:06:10]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - - 236s autopkgtest [03:06:11]: test 0004-valkey-check-rdb: preparing testbed 236s Reading package lists... 236s Building dependency tree... 236s Reading state information... 236s Starting pkgProblemResolver with broken count: 0 236s Starting 2 pkgProblemResolver with broken count: 0 236s Done 237s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 237s autopkgtest [03:06:12]: test 0004-valkey-check-rdb: [----------------------- 243s OK 243s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb 243s [offset 27] AUX FIELD valkey-ver = '7.2.7' 243s [offset 41] AUX FIELD redis-bits = '64' 243s [offset 53] AUX FIELD ctime = '1740711978' 243s [offset 68] AUX FIELD used-mem = '3057368' 243s [offset 80] AUX FIELD aof-base = '0' 243s [offset 82] Selecting DB ID 0 243s [offset 566261] Checksum OK 243s [offset 566261] \o/ RDB looks OK! \o/ 243s [info] 5 keys read 243s [info] 0 expires 243s [info] 0 already expired 243s autopkgtest [03:06:18]: test 0004-valkey-check-rdb: -----------------------] 244s autopkgtest [03:06:19]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - - 244s 0004-valkey-check-rdb PASS 244s autopkgtest [03:06:19]: test 0005-cjson: preparing testbed 244s Reading package lists... 245s Building dependency tree... 245s Reading state information... 245s Starting pkgProblemResolver with broken count: 0 245s Starting 2 pkgProblemResolver with broken count: 0 245s Done 246s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
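The 0004-valkey-check-rdb run above points the offline integrity checker at the dump written by the server. A minimal sketch for repeating that check by hand, assuming valkey-tools is installed and the dump sits at the path printed in the log:

# Hypothetical manual invocation mirroring the 0004 test; on a healthy dump the
# tool prints the AUX fields, "Checksum OK" and "RDB looks OK!" as seen above.
valkey-check-rdb /var/lib/valkey/dump.rdb

The 0003 test exercises the companion valkey-check-aof tool in the same fashion.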
247s autopkgtest [03:06:22]: test 0005-cjson: [----------------------- 252s 253s autopkgtest [03:06:28]: test 0005-cjson: -----------------------] 253s 0005-cjson PASS 253s autopkgtest [03:06:28]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - - 253s autopkgtest [03:06:28]: test 0006-migrate-from-redis: preparing testbed 329s autopkgtest [03:07:44]: testbed dpkg architecture: ppc64el 329s autopkgtest [03:07:44]: testbed apt version: 2.7.14build2 330s autopkgtest [03:07:45]: @@@@@@@@@@@@@@@@@@@@ test bed setup 330s autopkgtest [03:07:45]: testbed release detected to be: noble 331s autopkgtest [03:07:46]: updating testbed package index (apt update) 331s Get:1 http://ftpmaster.internal/ubuntu noble-proposed InRelease [265 kB] 331s Hit:2 http://ftpmaster.internal/ubuntu noble InRelease 331s Hit:3 http://ftpmaster.internal/ubuntu noble-updates InRelease 332s Hit:4 http://ftpmaster.internal/ubuntu noble-security InRelease 332s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/universe Sources [66.2 kB] 332s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/multiverse Sources [9488 B] 332s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main Sources [61.6 kB] 332s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/restricted Sources [18.6 kB] 332s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el Packages [88.2 kB] 332s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el c-n-f Metadata [3752 B] 332s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/restricted ppc64el Packages [1380 B] 332s Get:12 http://ftpmaster.internal/ubuntu noble-proposed/restricted ppc64el c-n-f Metadata [116 B] 332s Get:13 http://ftpmaster.internal/ubuntu noble-proposed/universe ppc64el Packages [416 kB] 332s Get:14 http://ftpmaster.internal/ubuntu noble-proposed/universe ppc64el c-n-f Metadata [9704 B] 332s Get:15 http://ftpmaster.internal/ubuntu noble-proposed/multiverse ppc64el Packages [968 B] 332s Get:16 http://ftpmaster.internal/ubuntu noble-proposed/multiverse ppc64el c-n-f Metadata [116 B] 337s Fetched 941 kB in 1s (824 kB/s) 338s Reading package lists... 339s Reading package lists... 339s Building dependency tree... 339s Reading state information... 339s Calculating upgrade... 339s The following packages will be upgraded: 339s cloud-init cryptsetup-bin libcryptsetup12 339s 3 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 339s Need to get 1206 kB of archives. 339s After this operation, 13.3 kB of additional disk space will be used. 339s Get:1 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libcryptsetup12 ppc64el 2:2.7.0-1ubuntu4.2 [375 kB] 340s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el cryptsetup-bin ppc64el 2:2.7.0-1ubuntu4.2 [227 kB] 340s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el cloud-init all 24.4.1-0ubuntu0~24.04.1 [604 kB] 340s Preconfiguring packages ... 340s Fetched 1206 kB in 1s (1471 kB/s) 341s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 
72680 files and directories currently installed.) 341s Preparing to unpack .../libcryptsetup12_2%3a2.7.0-1ubuntu4.2_ppc64el.deb ... 341s Unpacking libcryptsetup12:ppc64el (2:2.7.0-1ubuntu4.2) over (2:2.7.0-1ubuntu4.1) ... 341s Preparing to unpack .../cryptsetup-bin_2%3a2.7.0-1ubuntu4.2_ppc64el.deb ... 341s Unpacking cryptsetup-bin (2:2.7.0-1ubuntu4.2) over (2:2.7.0-1ubuntu4.1) ... 341s Preparing to unpack .../cloud-init_24.4.1-0ubuntu0~24.04.1_all.deb ... 341s Unpacking cloud-init (24.4.1-0ubuntu0~24.04.1) over (24.4-0ubuntu1~24.04.2) ... 341s Setting up cloud-init (24.4.1-0ubuntu0~24.04.1) ... 343s Setting up libcryptsetup12:ppc64el (2:2.7.0-1ubuntu4.2) ... 343s Setting up cryptsetup-bin (2:2.7.0-1ubuntu4.2) ... 343s Processing triggers for rsyslog (8.2312.0-3ubuntu9) ... 343s Processing triggers for man-db (2.12.0-4build2) ... 344s Processing triggers for libc-bin (2.39-0ubuntu8.4) ... 344s Reading package lists... 344s Building dependency tree... 344s Reading state information... 344s 0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded. 344s autopkgtest [03:07:59]: upgrading testbed (apt dist-upgrade and autopurge) 345s Reading package lists... 345s Building dependency tree... 345s Reading state information... 345s Calculating upgrade...Starting pkgProblemResolver with broken count: 0 345s Starting 2 pkgProblemResolver with broken count: 0 345s Done 345s Entering ResolveByKeep 346s 346s The following packages will be upgraded: 346s libnss-systemd libpam-systemd libsystemd-shared libsystemd0 libudev1 systemd 346s systemd-dev systemd-resolved systemd-sysv systemd-timesyncd udev 346s 11 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 346s Need to get 9885 kB of archives. 346s After this operation, 0 B of additional disk space will be used. 346s Get:1 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libnss-systemd ppc64el 255.4-1ubuntu8.6 [207 kB] 346s Get:2 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-dev all 255.4-1ubuntu8.6 [104 kB] 346s Get:3 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-timesyncd ppc64el 255.4-1ubuntu8.6 [37.6 kB] 346s Get:4 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-resolved ppc64el 255.4-1ubuntu8.6 [345 kB] 346s Get:5 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libsystemd-shared ppc64el 255.4-1ubuntu8.6 [2346 kB] 347s Get:6 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libsystemd0 ppc64el 255.4-1ubuntu8.6 [526 kB] 347s Get:7 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd-sysv ppc64el 255.4-1ubuntu8.6 [11.9 kB] 347s Get:8 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libpam-systemd ppc64el 255.4-1ubuntu8.6 [303 kB] 347s Get:9 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el systemd ppc64el 255.4-1ubuntu8.6 [3767 kB] 348s Get:10 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el udev ppc64el 255.4-1ubuntu8.6 [2036 kB] 348s Get:11 http://ftpmaster.internal/ubuntu noble-proposed/main ppc64el libudev1 ppc64el 255.4-1ubuntu8.6 [200 kB] 348s Fetched 9885 kB in 2s (4008 kB/s) 349s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 
70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 72680 files and directories currently installed.) 349s Preparing to unpack .../0-libnss-systemd_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking libnss-systemd:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../1-systemd-dev_255.4-1ubuntu8.6_all.deb ... 349s Unpacking systemd-dev (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../2-systemd-timesyncd_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking systemd-timesyncd (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../3-systemd-resolved_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking systemd-resolved (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../4-libsystemd-shared_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking libsystemd-shared:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../5-libsystemd0_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking libsystemd0:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Setting up libsystemd0:ppc64el (255.4-1ubuntu8.6) ... 349s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 72680 files and directories currently installed.) 349s Preparing to unpack .../systemd-sysv_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking systemd-sysv (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../libpam-systemd_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking libpam-systemd:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../systemd_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking systemd (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../udev_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking udev (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Preparing to unpack .../libudev1_255.4-1ubuntu8.6_ppc64el.deb ... 349s Unpacking libudev1:ppc64el (255.4-1ubuntu8.6) over (255.4-1ubuntu8.5) ... 349s Setting up libudev1:ppc64el (255.4-1ubuntu8.6) ... 349s Setting up systemd-dev (255.4-1ubuntu8.6) ... 349s Setting up libsystemd-shared:ppc64el (255.4-1ubuntu8.6) ... 349s Setting up systemd (255.4-1ubuntu8.6) ... 350s Setting up systemd-timesyncd (255.4-1ubuntu8.6) ... 350s Setting up udev (255.4-1ubuntu8.6) ... 351s Setting up systemd-resolved (255.4-1ubuntu8.6) ... 352s Setting up systemd-sysv (255.4-1ubuntu8.6) ... 352s Setting up libnss-systemd:ppc64el (255.4-1ubuntu8.6) ... 352s Setting up libpam-systemd:ppc64el (255.4-1ubuntu8.6) ... 352s Processing triggers for libc-bin (2.39-0ubuntu8.4) ... 352s Processing triggers for man-db (2.12.0-4build2) ... 353s Processing triggers for dbus (1.14.10-4ubuntu4.1) ... 353s Processing triggers for initramfs-tools (0.142ubuntu25.5) ... 353s update-initramfs: Generating /boot/initrd.img-6.8.0-54-generic 353s W: No lz4 in /usr/bin:/sbin:/bin, using gzip 360s Reading package lists... 
360s Building dependency tree... 360s Reading state information... 360s Starting pkgProblemResolver with broken count: 0 360s Starting 2 pkgProblemResolver with broken count: 0 360s Done 360s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 361s autopkgtest [03:08:16]: rebooting testbed after setup commands that affected boot 395s autopkgtest-virt-ssh: WARNING: ssh connection failed. Retrying in 3 seconds... 405s Reading package lists... 405s Building dependency tree... 405s Reading state information... 406s Starting pkgProblemResolver with broken count: 0 406s Starting 2 pkgProblemResolver with broken count: 0 406s Done 406s The following NEW packages will be installed: 406s libatomic1 libjemalloc2 liblzf1 redis-sentinel redis-server redis-tools 406s 0 upgraded, 6 newly installed, 0 to remove and 0 not upgraded. 406s Need to get 1807 kB of archives. 406s After this operation, 10.3 MB of additional disk space will be used. 406s Get:1 http://ftpmaster.internal/ubuntu noble-updates/main ppc64el libatomic1 ppc64el 14.2.0-4ubuntu2~24.04 [10.8 kB] 406s Get:2 http://ftpmaster.internal/ubuntu noble/universe ppc64el libjemalloc2 ppc64el 5.3.0-2build1 [259 kB] 406s Get:3 http://ftpmaster.internal/ubuntu noble/universe ppc64el liblzf1 ppc64el 3.6-4 [7920 B] 406s Get:4 http://ftpmaster.internal/ubuntu noble/universe ppc64el redis-tools ppc64el 5:7.0.15-1build2 [1465 kB] 407s Get:5 http://ftpmaster.internal/ubuntu noble/universe ppc64el redis-sentinel ppc64el 5:7.0.15-1build2 [12.2 kB] 407s Get:6 http://ftpmaster.internal/ubuntu noble/universe ppc64el redis-server ppc64el 5:7.0.15-1build2 [51.7 kB] 407s Fetched 1807 kB in 1s (2174 kB/s) 407s Selecting previously unselected package libatomic1:ppc64el. 407s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 72680 files and directories currently installed.) 407s Preparing to unpack .../0-libatomic1_14.2.0-4ubuntu2~24.04_ppc64el.deb ... 407s Unpacking libatomic1:ppc64el (14.2.0-4ubuntu2~24.04) ... 407s Selecting previously unselected package libjemalloc2:ppc64el. 407s Preparing to unpack .../1-libjemalloc2_5.3.0-2build1_ppc64el.deb ... 407s Unpacking libjemalloc2:ppc64el (5.3.0-2build1) ... 407s Selecting previously unselected package liblzf1:ppc64el. 407s Preparing to unpack .../2-liblzf1_3.6-4_ppc64el.deb ... 407s Unpacking liblzf1:ppc64el (3.6-4) ... 407s Selecting previously unselected package redis-tools. 407s Preparing to unpack .../3-redis-tools_5%3a7.0.15-1build2_ppc64el.deb ... 407s Unpacking redis-tools (5:7.0.15-1build2) ... 407s Selecting previously unselected package redis-sentinel. 407s Preparing to unpack .../4-redis-sentinel_5%3a7.0.15-1build2_ppc64el.deb ... 407s Unpacking redis-sentinel (5:7.0.15-1build2) ... 407s Selecting previously unselected package redis-server. 407s Preparing to unpack .../5-redis-server_5%3a7.0.15-1build2_ppc64el.deb ... 407s Unpacking redis-server (5:7.0.15-1build2) ... 407s Setting up libjemalloc2:ppc64el (5.3.0-2build1) ... 407s Setting up liblzf1:ppc64el (3.6-4) ... 
407s Setting up libatomic1:ppc64el (14.2.0-4ubuntu2~24.04) ... 407s Setting up redis-tools (5:7.0.15-1build2) ... 407s Setting up redis-server (5:7.0.15-1build2) ... 408s Created symlink /etc/systemd/system/redis.service → /usr/lib/systemd/system/redis-server.service. 408s Created symlink /etc/systemd/system/multi-user.target.wants/redis-server.service → /usr/lib/systemd/system/redis-server.service. 408s Setting up redis-sentinel (5:7.0.15-1build2) ... 409s Created symlink /etc/systemd/system/sentinel.service → /usr/lib/systemd/system/redis-sentinel.service. 409s Created symlink /etc/systemd/system/multi-user.target.wants/redis-sentinel.service → /usr/lib/systemd/system/redis-sentinel.service. 409s Processing triggers for man-db (2.12.0-4build2) ... 410s Processing triggers for libc-bin (2.39-0ubuntu8.4) ... 420s autopkgtest [03:09:15]: test 0006-migrate-from-redis: [----------------------- 420s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION 420s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf 420s + systemctl restart redis-server 421s + redis-cli -h 127.0.0.1 -p 6379 SET test 1 421s OK 421s + redis-cli -h 127.0.0.1 -p 6379 GET test 421s 1 421s + redis-cli -h 127.0.0.1 -p 6379 SAVE 421s OK 421s + sha256sum /var/lib/redis/dump.rdb 421s + apt-get install -y valkey-redis-compat 421s 752815c885a1cf89a3031038a8fc14157c8ad12e1cf374731355bb78db218cb1 /var/lib/redis/dump.rdb 421s Reading package lists... 421s Building dependency tree... 421s Reading state information... 421s The following additional packages will be installed: 421s valkey-server valkey-tools 421s Suggested packages: 421s ruby-redis 421s The following packages will be REMOVED: 421s redis-sentinel redis-server redis-tools 421s The following NEW packages will be installed: 421s valkey-redis-compat valkey-server valkey-tools 421s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded. 421s Need to get 1605 kB of archives. 421s After this operation, 170 kB of additional disk space will be used. 421s Get:1 http://ftpmaster.internal/ubuntu noble-updates/universe ppc64el valkey-tools ppc64el 7.2.7+dfsg1-0ubuntu0.24.04.1 [1548 kB] 422s Get:2 http://ftpmaster.internal/ubuntu noble-updates/universe ppc64el valkey-server ppc64el 7.2.7+dfsg1-0ubuntu0.24.04.1 [49.2 kB] 422s Get:3 http://ftpmaster.internal/ubuntu noble-updates/universe ppc64el valkey-redis-compat all 7.2.7+dfsg1-0ubuntu0.24.04.1 [7744 B] 422s Fetched 1605 kB in 1s (1529 kB/s) 422s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 72739 files and directories currently installed.) 422s Removing redis-sentinel (5:7.0.15-1build2) ... 423s Removing redis-server (5:7.0.15-1build2) ... 423s Removing redis-tools (5:7.0.15-1build2) ... 423s Selecting previously unselected package valkey-tools. 423s (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 
45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 72702 files and directories currently installed.) 423s Preparing to unpack .../valkey-tools_7.2.7+dfsg1-0ubuntu0.24.04.1_ppc64el.deb ... 423s Unpacking valkey-tools (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 423s Selecting previously unselected package valkey-server. 423s Preparing to unpack .../valkey-server_7.2.7+dfsg1-0ubuntu0.24.04.1_ppc64el.deb ... 423s Unpacking valkey-server (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 423s Selecting previously unselected package valkey-redis-compat. 423s Preparing to unpack .../valkey-redis-compat_7.2.7+dfsg1-0ubuntu0.24.04.1_all.deb ... 423s Unpacking valkey-redis-compat (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 423s Setting up valkey-tools (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 424s Setting up valkey-server (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 424s Created symlink /etc/systemd/system/valkey.service → /usr/lib/systemd/system/valkey-server.service. 424s Created symlink /etc/systemd/system/multi-user.target.wants/valkey-server.service → /usr/lib/systemd/system/valkey-server.service. 424s Setting up valkey-redis-compat (7.2.7+dfsg1-0ubuntu0.24.04.1) ... 425s dpkg-query: no packages found matching valkey-sentinel 425s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf. 425s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf. 425s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey. 425s Processing triggers for man-db (2.12.0-4build2) ... 425s + '[' -f /etc/valkey/REDIS_MIGRATION ']' 425s + sha256sum /var/lib/valkey/dump.rdb 425s edf2164ddc697288ea62479f6a360ec17270b7786ac4af0cea5e557cf5074a35 /var/lib/valkey/dump.rdb 425s + systemctl status valkey-server 425s + grep inactive 425s Active: inactive (dead) since Fri 2025-02-28 03:09:19 UTC; 501ms ago 425s + rm /etc/valkey/REDIS_MIGRATION 425s + systemctl start valkey-server 425s + systemctl status valkey-server 425s + grep running 425s Active: active (running) since Fri 2025-02-28 03:09:20 UTC; 6ms ago 425s + sha256sum /var/lib/valkey/dump.rdb 425s edf2164ddc697288ea62479f6a360ec17270b7786ac4af0cea5e557cf5074a35 /var/lib/valkey/dump.rdb 425s + cat /etc/valkey/valkey.conf 425s + grep loglevel 425s + grep debug 425s loglevel debug 425s + valkey-cli -h 127.0.0.1 -p 6379 GET test 425s + grep 1 425s 1 425s autopkgtest [03:09:20]: test 0006-migrate-from-redis: -----------------------] 426s 0006-migrate-from-redis PASS 426s autopkgtest [03:09:21]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - - 426s autopkgtest [03:09:21]: @@@@@@@@@@@@@@@@@@@@ summary 426s 0001-valkey-cli PASS 426s 0002-benchmark PASS 426s 0003-valkey-check-aof PASS 426s 0004-valkey-check-rdb PASS 426s 0005-cjson PASS 426s 0006-migrate-from-redis PASS 432s nova [W] Using flock in prodstack6-ppc64el 432s Creating nova instance adt-noble-ppc64el-valkey-20250228-030214-juju-7f2275-prod-proposed-migration-environment-2-ef545cde-46d0-42d7-8d0f-255c573bf3db from image adt/ubuntu-noble-ppc64el-server-20250227.img (UUID 8bfef3ab-415c-4417-9935-a648656815a2)... 
432s nova [W] Timed out waiting for 03b449a8-f14a-44b4-bf51-93e28e9cd5b5 to get deleted.
432s nova [W] Using flock in prodstack6-ppc64el
432s Creating nova instance adt-noble-ppc64el-valkey-20250228-030214-juju-7f2275-prod-proposed-migration-environment-2-ef545cde-46d0-42d7-8d0f-255c573bf3db from image adt/ubuntu-noble-ppc64el-server-20250227.img (UUID 8bfef3ab-415c-4417-9935-a648656815a2)...
432s nova [W] Timed out waiting for 254a8f20-3ee7-40e7-b52e-77b06c3379bb to get deleted.
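For reference, the 0006-migrate-from-redis scenario traced above condenses to the following shell sketch. It only restates steps visible in the log (seed a key in redis, dump it, install valkey-redis-compat, then confirm config and data under valkey) and assumes a Noble system with redis-server already installed and running; it is not the packaged test script itself.

#!/bin/sh -e
# Condensed restatement of the migration steps traced in the 0006 test log above.
sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf   # mark redis.conf so the setting can be traced into valkey.conf
systemctl restart redis-server
redis-cli -h 127.0.0.1 -p 6379 SET test 1                          # seed a key
redis-cli -h 127.0.0.1 -p 6379 SAVE                                # force an RDB dump
sha256sum /var/lib/redis/dump.rdb                                  # pre-migration checksum
apt-get install -y valkey-redis-compat                             # removes redis-*, installs valkey-server/valkey-tools,
                                                                   # copies /etc/redis/redis.conf to /etc/valkey/valkey.conf
                                                                   # and moves dumps from /var/lib/redis/ to /var/lib/valkey
test -f /etc/valkey/REDIS_MIGRATION                                # flag file left in place by the compat package
sha256sum /var/lib/valkey/dump.rdb                                 # the dump now lives under /var/lib/valkey
rm /etc/valkey/REDIS_MIGRATION                                     # the log shows valkey-server inactive until the test clears
                                                                   # this flag and starts the service
systemctl start valkey-server
grep loglevel /etc/valkey/valkey.conf                              # expect "loglevel debug" carried over from redis.conf
valkey-cli -h 127.0.0.1 -p 6379 GET test                           # prints 1: the seeded key survived the migration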