0s autopkgtest [12:21:25]: starting date and time: 2025-03-15 12:21:25+0000
0s autopkgtest [12:21:25]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [12:21:25]: host juju-7f2275-prod-proposed-migration-environment-20; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.tcmqy8mz/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:glibc --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=glibc/2.41-1ubuntu2 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-20@bos03-arm64-23.secgroup --name adt-plucky-arm64-valkey-20250315-122125-juju-7f2275-prod-proposed-migration-environment-20-48b98757-de80-40e4-95c3-65398558a9c0 --image adt/ubuntu-plucky-arm64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-20 --net-id=net_prod-proposed-migration -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
177s autopkgtest [12:24:22]: testbed dpkg architecture: arm64
177s autopkgtest [12:24:22]: testbed apt version: 2.9.33
178s autopkgtest [12:24:23]: @@@@@@@@@@@@@@@@@@@@ test bed setup
178s autopkgtest [12:24:23]: testbed release detected to be: None
179s autopkgtest [12:24:24]: updating testbed package index (apt update)
179s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
179s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
179s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
179s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
179s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [46.2 kB]
180s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [410 kB]
180s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
180s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 Packages [78.2 kB]
180s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 c-n-f Metadata [1888 B]
180s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted arm64 c-n-f Metadata [116 B]
180s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 Packages [353 kB]
180s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 c-n-f Metadata [15.7 kB]
180s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 Packages [4948 B]
180s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 c-n-f Metadata [572 B]
181s Fetched 1052 kB in 2s (623 kB/s)
182s Reading package lists...
182s Reading package lists...
182s Building dependency tree...
182s Reading state information...
183s Calculating upgrade...
183s Calculating upgrade...
183s The following packages will be upgraded:
183s   python3-jinja2 strace
184s 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
184s Need to get 608 kB of archives.
184s After this operation, 11.3 kB of additional disk space will be used.
184s Get:1 http://ftpmaster.internal/ubuntu plucky/main arm64 strace arm64 6.13+ds-1ubuntu1 [499 kB]
184s Get:2 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
185s Fetched 608 kB in 1s (633 kB/s)
185s (Reading database ... 117701 files and directories currently installed.)
185s Preparing to unpack .../strace_6.13+ds-1ubuntu1_arm64.deb ...
185s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
185s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
185s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
185s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
186s Setting up strace (6.13+ds-1ubuntu1) ...
186s Processing triggers for man-db (2.13.0-1) ...
186s Reading package lists...
187s Building dependency tree...
187s Reading state information...
187s Solving dependencies...
187s The following packages will be REMOVED:
187s   libnsl2* libpython3.12-minimal* libpython3.12-stdlib* libpython3.12t64*
187s   libunwind8* linux-headers-6.11.0-8* linux-headers-6.11.0-8-generic*
187s   linux-image-6.11.0-8-generic* linux-modules-6.11.0-8-generic*
187s   linux-tools-6.11.0-8* linux-tools-6.11.0-8-generic*
188s 0 upgraded, 0 newly installed, 11 to remove and 5 not upgraded.
188s After this operation, 267 MB disk space will be freed.
188s (Reading database ... 117701 files and directories currently installed.)
188s Removing linux-tools-6.11.0-8-generic (6.11.0-8.8) ...
188s Removing linux-tools-6.11.0-8 (6.11.0-8.8) ...
188s Removing libpython3.12t64:arm64 (3.12.9-1) ...
188s Removing libpython3.12-stdlib:arm64 (3.12.9-1) ...
188s Removing libnsl2:arm64 (1.3.0-3build3) ...
188s Removing libpython3.12-minimal:arm64 (3.12.9-1) ...
188s Removing libunwind8:arm64 (1.6.2-3.1) ...
188s Removing linux-headers-6.11.0-8-generic (6.11.0-8.8) ...
188s Removing linux-headers-6.11.0-8 (6.11.0-8.8) ...
190s Removing linux-image-6.11.0-8-generic (6.11.0-8.8) ...
190s I: /boot/vmlinuz.old is now a symlink to vmlinuz-6.14.0-10-generic
190s I: /boot/initrd.img.old is now a symlink to initrd.img-6.14.0-10-generic
190s /etc/kernel/postrm.d/initramfs-tools:
190s update-initramfs: Deleting /boot/initrd.img-6.11.0-8-generic
190s /etc/kernel/postrm.d/zz-flash-kernel:
190s flash-kernel: Kernel 6.11.0-8-generic has been removed.
190s flash-kernel: A higher version (6.14.0-10-generic) is still installed, no reflashing required.
190s /etc/kernel/postrm.d/zz-update-grub:
190s Sourcing file `/etc/default/grub'
190s Sourcing file `/etc/default/grub.d/50-cloudimg-settings.cfg'
190s Generating grub configuration file ...
191s Found linux image: /boot/vmlinuz-6.14.0-10-generic
191s Found initrd image: /boot/initrd.img-6.14.0-10-generic
191s Warning: os-prober will not be executed to detect other bootable partitions.
191s Systems on them will not be added to the GRUB boot configuration.
191s Check GRUB_DISABLE_OS_PROBER documentation entry.
191s Adding boot menu entry for UEFI Firmware Settings ...
191s done
191s Removing linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
191s Processing triggers for libc-bin (2.41-1ubuntu1) ...
191s (Reading database ... 81650 files and directories currently installed.)
191s Purging configuration files for linux-image-6.11.0-8-generic (6.11.0-8.8) ...
192s Purging configuration files for libpython3.12-minimal:arm64 (3.12.9-1) ...
192s Purging configuration files for linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
192s autopkgtest [12:24:37]: upgrading testbed (apt dist-upgrade and autopurge)
192s Reading package lists...
192s Building dependency tree...
192s Reading state information...
193s Calculating upgrade...
193s Starting pkgProblemResolver with broken count: 0
193s Starting 2 pkgProblemResolver with broken count: 0
193s Done
194s Entering ResolveByKeep
194s
194s Calculating upgrade...
195s The following packages will be upgraded:
195s   libc-bin libc-dev-bin libc6 libc6-dev locales
195s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
195s Need to get 9530 kB of archives.
195s After this operation, 0 B of additional disk space will be used.
195s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6-dev arm64 2.41-1ubuntu2 [1750 kB]
197s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-dev-bin arm64 2.41-1ubuntu2 [24.0 kB]
197s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6 arm64 2.41-1ubuntu2 [2910 kB]
199s Get:4 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-bin arm64 2.41-1ubuntu2 [600 kB]
200s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 locales all 2.41-1ubuntu2 [4246 kB]
204s Preconfiguring packages ...
204s Fetched 9530 kB in 9s (1039 kB/s)
204s (Reading database ... 81647 files and directories currently installed.)
204s Preparing to unpack .../libc6-dev_2.41-1ubuntu2_arm64.deb ...
204s Unpacking libc6-dev:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
204s Preparing to unpack .../libc-dev-bin_2.41-1ubuntu2_arm64.deb ...
204s Unpacking libc-dev-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
205s Preparing to unpack .../libc6_2.41-1ubuntu2_arm64.deb ...
205s Unpacking libc6:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
205s Setting up libc6:arm64 (2.41-1ubuntu2) ...
205s (Reading database ... 81647 files and directories currently installed.)
205s Preparing to unpack .../libc-bin_2.41-1ubuntu2_arm64.deb ...
205s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
205s Setting up libc-bin (2.41-1ubuntu2) ...
205s (Reading database ... 81647 files and directories currently installed.)
205s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
205s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
206s Setting up locales (2.41-1ubuntu2) ...
206s Generating locales (this might take a while)...
208s   en_US.UTF-8... done
208s Generation complete.
208s Setting up libc-dev-bin (2.41-1ubuntu2) ...
208s Setting up libc6-dev:arm64 (2.41-1ubuntu2) ...
208s Processing triggers for man-db (2.13.0-1) ...
209s Processing triggers for systemd (257.3-1ubuntu3) ...
210s Reading package lists...
211s Building dependency tree...
211s Reading state information...
211s Starting pkgProblemResolver with broken count: 0
211s Starting 2 pkgProblemResolver with broken count: 0
211s Done
211s Solving dependencies...
212s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
212s autopkgtest [12:24:57]: rebooting testbed after setup commands that affected boot
235s autopkgtest [12:25:20]: testbed running kernel: Linux 6.14.0-10-generic #10-Ubuntu SMP PREEMPT_DYNAMIC Wed Mar 12 15:45:31 UTC 2025
238s autopkgtest [12:25:23]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
244s Get:1 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (dsc) [2484 B]
244s Get:2 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (tar) [2599 kB]
244s Get:3 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (diff) [18.1 kB]
244s gpgv: Signature made Wed Feb 12 14:50:45 2025 UTC
244s gpgv: using RSA key 63EEFC3DE14D5146CE7F24BF34B8AD7D9529E793
244s gpgv: issuer "lena.voytek@canonical.com"
244s gpgv: Can't check signature: No public key
244s dpkg-source: warning: cannot verify inline signature for ./valkey_8.0.2+dfsg1-1ubuntu1.dsc: no acceptable signature found
244s autopkgtest [12:25:29]: testing package valkey version 8.0.2+dfsg1-1ubuntu1
246s autopkgtest [12:25:31]: build not needed
248s autopkgtest [12:25:33]: test 0001-valkey-cli: preparing testbed
248s Reading package lists...
248s Building dependency tree...
248s Reading state information...
249s Starting pkgProblemResolver with broken count: 0
249s Starting 2 pkgProblemResolver with broken count: 0
249s Done
249s The following NEW packages will be installed:
249s   liblzf1 valkey-server valkey-tools
250s 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded.
250s Need to get 1308 kB of archives.
250s After this operation, 7493 kB of additional disk space will be used.
250s Get:1 http://ftpmaster.internal/ubuntu plucky/universe arm64 liblzf1 arm64 3.6-4 [7426 B]
250s Get:2 http://ftpmaster.internal/ubuntu plucky/universe arm64 valkey-tools arm64 8.0.2+dfsg1-1ubuntu1 [1252 kB]
251s Get:3 http://ftpmaster.internal/ubuntu plucky/universe arm64 valkey-server arm64 8.0.2+dfsg1-1ubuntu1 [48.5 kB]
251s Fetched 1308 kB in 2s (776 kB/s)
252s Selecting previously unselected package liblzf1:arm64.
252s (Reading database ... 81647 files and directories currently installed.)
252s Preparing to unpack .../liblzf1_3.6-4_arm64.deb ...
252s Unpacking liblzf1:arm64 (3.6-4) ...
252s Selecting previously unselected package valkey-tools.
252s Preparing to unpack .../valkey-tools_8.0.2+dfsg1-1ubuntu1_arm64.deb ...
252s Unpacking valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
252s Selecting previously unselected package valkey-server.
252s Preparing to unpack .../valkey-server_8.0.2+dfsg1-1ubuntu1_arm64.deb ...
252s Unpacking valkey-server (8.0.2+dfsg1-1ubuntu1) ...
252s Setting up liblzf1:arm64 (3.6-4) ...
252s Setting up valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
252s Setting up valkey-server (8.0.2+dfsg1-1ubuntu1) ...
253s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
253s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
253s Processing triggers for man-db (2.13.0-1) ...
254s Processing triggers for libc-bin (2.41-1ubuntu2) ...
255s autopkgtest [12:25:40]: test 0001-valkey-cli: [-----------------------
260s # Server
260s redis_version:7.2.4
260s server_name:valkey
260s valkey_version:8.0.2
260s redis_git_sha1:00000000
260s redis_git_dirty:0
260s redis_build_id:5fe77b42c48a3400
260s server_mode:standalone
260s os:Linux 6.14.0-10-generic aarch64
260s arch_bits:64
260s monotonic_clock:POSIX clock_gettime
260s multiplexing_api:epoll
260s gcc_version:14.2.0
260s process_id:1555
260s process_supervised:systemd
260s run_id:18771bbc736ed0b1c9ab0de528fda4f9837afdc5
260s tcp_port:6379
260s server_time_usec:1742041545177359
260s uptime_in_seconds:5
260s uptime_in_days:0
260s hz:10
260s configured_hz:10
260s lru_clock:13988297
260s executable:/usr/bin/valkey-server
260s config_file:/etc/valkey/valkey.conf
260s io_threads_active:0
260s availability_zone:
260s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
260s
260s # Clients
260s connected_clients:1
260s cluster_connections:0
260s maxclients:10000
260s client_recent_max_input_buffer:0
260s client_recent_max_output_buffer:0
260s blocked_clients:0
260s tracking_clients:0
260s pubsub_clients:0
260s watching_clients:0
260s clients_in_timeout_table:0
260s total_watched_keys:0
260s total_blocking_keys:0
260s total_blocking_keys_on_nokey:0
260s
260s # Memory
260s used_memory:982744
260s used_memory_human:959.71K
260s used_memory_rss:13582336
260s used_memory_rss_human:12.95M
260s used_memory_peak:982744
260s used_memory_peak_human:959.71K
260s used_memory_peak_perc:100.28%
260s used_memory_overhead:962256
260s used_memory_startup:962056
260s used_memory_dataset:20488
260s used_memory_dataset_perc:99.03%
260s allocator_allocated:4541280
260s allocator_active:9502720
260s allocator_resident:10813440
260s allocator_muzzy:0
260s total_system_memory:4088066048
260s total_system_memory_human:3.81G
260s used_memory_lua:31744
260s used_memory_vm_eval:31744
260s used_memory_lua_human:31.00K
260s used_memory_scripts_eval:0
260s number_of_cached_scripts:0
260s number_of_functions:0
260s number_of_libraries:0
260s used_memory_vm_functions:33792
260s used_memory_vm_total:65536
260s used_memory_vm_total_human:64.00K
260s used_memory_functions:200
260s used_memory_scripts:200
260s used_memory_scripts_human:200B
260s maxmemory:0
260s maxmemory_human:0B
260s maxmemory_policy:noeviction
260s allocator_frag_ratio:2.08
260s allocator_frag_bytes:4895904
260s allocator_rss_ratio:1.14
260s allocator_rss_bytes:1310720
260s rss_overhead_ratio:1.26
260s rss_overhead_bytes:2768896
260s mem_fragmentation_ratio:14.12
260s mem_fragmentation_bytes:12620136
260s mem_not_counted_for_evict:0
260s mem_replication_backlog:0
260s mem_total_replication_buffers:0
260s mem_clients_slaves:0
260s mem_clients_normal:0
260s mem_cluster_links:0
260s mem_aof_buffer:0
260s mem_allocator:jemalloc-5.3.0
260s mem_overhead_db_hashtable_rehashing:0
260s active_defrag_running:0
260s lazyfree_pending_objects:0
260s lazyfreed_objects:0
260s
260s # Persistence
260s loading:0
260s async_loading:0
260s current_cow_peak:0
260s current_cow_size:0
260s current_cow_size_age:0
260s current_fork_perc:0.00
260s current_save_keys_processed:0
260s current_save_keys_total:0
260s rdb_changes_since_last_save:0
260s rdb_bgsave_in_progress:0
260s rdb_last_save_time:1742041540
260s rdb_last_bgsave_status:ok
260s rdb_last_bgsave_time_sec:-1
260s rdb_current_bgsave_time_sec:-1
260s rdb_saves:0
260s rdb_last_cow_size:0
260s rdb_last_load_keys_expired:0
260s rdb_last_load_keys_loaded:0
260s aof_enabled:0
260s aof_rewrite_in_progress:0
260s aof_rewrite_scheduled:0
260s aof_last_rewrite_time_sec:-1
260s aof_current_rewrite_time_sec:-1
260s aof_last_bgrewrite_status:ok
260s aof_rewrites:0
260s aof_rewrites_consecutive_failures:0
260s aof_last_write_status:ok
260s aof_last_cow_size:0
260s module_fork_in_progress:0
260s module_fork_last_cow_size:0
260s
260s # Stats
260s total_connections_received:1
260s total_commands_processed:0
260s instantaneous_ops_per_sec:0
260s total_net_input_bytes:14
260s total_net_output_bytes:0
260s total_net_repl_input_bytes:0
260s total_net_repl_output_bytes:0
260s instantaneous_input_kbps:0.00
260s instantaneous_output_kbps:0.00
260s instantaneous_input_repl_kbps:0.00
260s instantaneous_output_repl_kbps:0.00
260s rejected_connections:0
260s sync_full:0
260s sync_partial_ok:0
260s sync_partial_err:0
260s expired_keys:0
260s expired_stale_perc:0.00
260s expired_time_cap_reached_count:0
260s expire_cycle_cpu_milliseconds:0
260s evicted_keys:0
260s evicted_clients:0
260s evicted_scripts:0
260s total_eviction_exceeded_time:0
260s current_eviction_exceeded_time:0
260s keyspace_hits:0
260s keyspace_misses:0
260s pubsub_channels:0
260s pubsub_patterns:0
260s pubsubshard_channels:0
260s latest_fork_usec:0
260s total_forks:0
260s migrate_cached_sockets:0
260s slave_expires_tracked_keys:0
260s active_defrag_hits:0
260s active_defrag_misses:0
260s active_defrag_key_hits:0
260s active_defrag_key_misses:0
260s total_active_defrag_time:0
260s current_active_defrag_time:0
260s tracking_total_keys:0
260s tracking_total_items:0
260s tracking_total_prefixes:0
260s unexpected_error_replies:0
260s total_error_replies:0
260s dump_payload_sanitizations:0
260s total_reads_processed:1
260s total_writes_processed:0
260s io_threaded_reads_processed:0
260s io_threaded_writes_processed:0
260s io_threaded_freed_objects:0
260s io_threaded_poll_processed:0
260s io_threaded_total_prefetch_batches:0
260s io_threaded_total_prefetch_entries:0
260s client_query_buffer_limit_disconnections:0
260s client_output_buffer_limit_disconnections:0
260s reply_buffer_shrinks:0
260s reply_buffer_expands:0
260s eventloop_cycles:51
260s eventloop_duration_sum:11629
260s eventloop_duration_cmd_sum:0
260s instantaneous_eventloop_cycles_per_sec:9
260s instantaneous_eventloop_duration_usec:221
260s acl_access_denied_auth:0
260s acl_access_denied_cmd:0
260s acl_access_denied_key:0
260s acl_access_denied_channel:0
260s
260s # Replication
260s role:master
260s connected_slaves:0
260s replicas_waiting_psync:0
260s master_failover_state:no-failover
260s master_replid:057cc4edcd315abcde296f11cc616b817403bffc
260s master_replid2:0000000000000000000000000000000000000000
260s master_repl_offset:0
260s second_repl_offset:-1
260s repl_backlog_active:0
260s repl_backlog_size:10485760
260s repl_backlog_first_byte_offset:0
260s repl_backlog_histlen:0
260s
260s # CPU
260s used_cpu_sys:0.027022
260s used_cpu_user:0.039693
260s used_cpu_sys_children:0.002352
260s used_cpu_user_children:0.001293
260s used_cpu_sys_main_thread:0.027588
260s used_cpu_user_main_thread:0.038624
260s
260s # Modules
260s
260s # Errorstats
260s
260s # Cluster
260s cluster_enabled:0
260s
260s # Keyspace
260s Redis ver. 8.0.2
260s autopkgtest [12:25:45]: test 0001-valkey-cli: -----------------------]
261s autopkgtest [12:25:46]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
261s 0001-valkey-cli PASS
261s autopkgtest [12:25:46]: test 0002-benchmark: preparing testbed
261s Reading package lists...
262s Building dependency tree...
262s Reading state information...
262s Starting pkgProblemResolver with broken count: 0
262s Starting 2 pkgProblemResolver with broken count: 0
262s Done
263s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
264s autopkgtest [12:25:49]: test 0002-benchmark: [-----------------------
269s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=nan (overall: nan)
269s ====== PING_INLINE ======
269s 100000 requests completed in 0.13 seconds
269s 50 parallel clients
269s 3 bytes payload
269s keep alive: 1
269s host configuration "save": 3600 1 300 100 60 10000
269s host configuration "appendonly": no
269s multi-thread: no
269s
269s Latency by percentile distribution:
269s 0.000% <= 0.127 milliseconds (cumulative count 10)
269s 50.000% <= 0.511 milliseconds (cumulative count 51760)
269s 75.000% <= 0.615 milliseconds (cumulative count 75500)
269s 87.500% <= 0.711 milliseconds (cumulative count 88380)
269s 93.750% <= 0.767 milliseconds (cumulative count 93900)
269s 96.875% <= 0.831 milliseconds (cumulative count 96960)
269s 98.438% <= 0.903 milliseconds (cumulative count 98460)
269s 99.219% <= 0.991 milliseconds (cumulative count 99260)
269s 99.609% <= 1.463 milliseconds (cumulative count 99610)
269s 99.805% <= 2.415 milliseconds (cumulative count 99810)
269s 99.902% <= 2.583 milliseconds (cumulative count 99910)
269s 99.951% <= 2.647 milliseconds (cumulative count 99960)
269s 99.976% <= 2.679 milliseconds (cumulative count 99980)
269s 99.988% <= 2.687 milliseconds (cumulative count 99990)
269s 99.994% <= 2.711 milliseconds (cumulative count 100000)
269s 100.000% <= 2.711 milliseconds (cumulative count 100000)
269s
269s Cumulative distribution of latencies:
269s 0.000% <= 0.103 milliseconds (cumulative count 0)
269s 0.110% <= 0.207 milliseconds (cumulative count 110)
269s 6.140% <= 0.303 milliseconds (cumulative count 6140)
269s 25.330% <= 0.407 milliseconds (cumulative count 25330)
269s 49.310% <= 0.503 milliseconds (cumulative count 49310)
269s 74.140% <= 0.607 milliseconds (cumulative count 74140)
269s 87.310% <= 0.703 milliseconds (cumulative count 87310)
269s 96.210% <= 0.807 milliseconds (cumulative count 96210)
269s 98.460% <= 0.903 milliseconds (cumulative count 98460)
269s 99.330% <= 1.007 milliseconds (cumulative count 99330)
269s 99.450% <= 1.103 milliseconds (cumulative count 99450)
269s 99.550% <= 1.207 milliseconds (cumulative count 99550)
269s 99.570% <= 1.303 milliseconds (cumulative count 99570)
269s 99.580% <= 1.407 milliseconds (cumulative count 99580)
269s 99.630% <= 1.503 milliseconds (cumulative count 99630)
269s 99.660% <= 1.607 milliseconds (cumulative count 99660)
269s 99.710% <= 1.807 milliseconds (cumulative count 99710)
269s 99.730% <= 1.903 milliseconds (cumulative count 99730)
269s 99.740% <= 2.103 milliseconds (cumulative count 99740)
269s 100.000% <= 3.103 milliseconds (cumulative count 100000)
269s
269s Summary:
269s throughput summary: 751879.69 requests per second
269s latency summary (msec):
269s avg min p50 p95 p99 max
269s 0.524 0.120 0.511 0.791 0.951 2.711
269s PING_MBULK: rps=385680.0 (overall: 824102.6) avg_msec=0.366 (overall: 0.366)
269s ====== PING_MBULK ======
269s 100000 requests completed in 0.12 seconds
269s 50 parallel clients
269s 3 bytes payload
269s keep alive: 1
269s host configuration "save": 3600 1 300 100 60 10000
269s host configuration "appendonly": no
269s multi-thread: no
269s
269s Latency by percentile distribution:
269s 0.000% <= 0.119 milliseconds (cumulative count 10)
269s 50.000% <= 0.343 milliseconds (cumulative count 50650)
269s 75.000% <= 0.399 milliseconds (cumulative count 75440)
269s 87.500% <= 0.463 milliseconds (cumulative count 87940)
269s 93.750% <= 0.559 milliseconds (cumulative count 93900)
269s 96.875% <= 0.631 milliseconds (cumulative count 97000)
269s 98.438% <= 0.695 milliseconds (cumulative count 98600)
269s 99.219% <= 0.743 milliseconds (cumulative count 99280)
269s 99.609% <= 0.791 milliseconds (cumulative count 99670)
269s 99.805% <= 0.815 milliseconds (cumulative count 99810)
269s 99.902% <= 0.839 milliseconds (cumulative count 99910)
269s 99.951% <= 0.935 milliseconds (cumulative count 99960)
269s 99.976% <= 0.959 milliseconds (cumulative count 99980)
269s 99.988% <= 0.975 milliseconds (cumulative count 99990)
269s 99.994% <= 1.055 milliseconds (cumulative count 100000)
269s 100.000% <= 1.055 milliseconds (cumulative count 100000)
269s
269s Cumulative distribution of latencies:
269s 0.000% <= 0.103 milliseconds (cumulative count 0)
269s 0.140% <= 0.207 milliseconds (cumulative count 140)
269s 24.300% <= 0.303 milliseconds (cumulative count 24300)
269s 77.780% <= 0.407 milliseconds (cumulative count 77780)
269s 90.910% <= 0.503 milliseconds (cumulative count 90910)
269s 96.050% <= 0.607 milliseconds (cumulative count 96050)
269s 98.740% <= 0.703 milliseconds (cumulative count 98740)
269s 99.760% <= 0.807 milliseconds (cumulative count 99760)
269s 99.930% <= 0.903 milliseconds (cumulative count 99930)
269s 99.990% <= 1.007 milliseconds (cumulative count 99990)
269s 100.000% <= 1.103 milliseconds (cumulative count 100000)
269s
269s Summary:
269s throughput summary: 819672.12 requests per second
269s latency summary (msec):
269s avg min p50 p95 p99 max
269s 0.368 0.112 0.343 0.591 0.719 1.055
269s ====== SET ======
269s 100000 requests completed in 0.16 seconds
269s 50 parallel clients
269s 3 bytes payload
269s keep alive: 1
269s host configuration "save": 3600 1 300 100 60 10000
269s host configuration "appendonly": no
269s multi-thread: no
269s
269s Latency by percentile distribution:
269s 0.000% <= 0.223 milliseconds (cumulative count 10)
269s 50.000% <= 0.663 milliseconds (cumulative count 50710)
269s 75.000% <= 0.783 milliseconds (cumulative count 75100)
269s 87.500% <= 0.871 milliseconds (cumulative count 88540)
269s 93.750% <= 0.919 milliseconds (cumulative count 94480)
269s 96.875% <= 0.951 milliseconds (cumulative count 97100)
269s 98.438% <= 0.999 milliseconds (cumulative count 98540)
269s 99.219% <= 1.055 milliseconds (cumulative count 99260)
269s 99.609% <= 1.119 milliseconds (cumulative count 99620)
269s 99.805% <= 1.183 milliseconds (cumulative count 99820)
269s 99.902% <= 1.263 milliseconds (cumulative count 99910)
269s 99.951% <= 1.351 milliseconds (cumulative count 99960)
269s 99.976% <= 1.399 milliseconds (cumulative count 99980)
269s 99.988% <= 1.415 milliseconds (cumulative count 99990)
269s 99.994% <= 1.455 milliseconds (cumulative count 100000)
269s 100.000% <= 1.455 milliseconds (cumulative count 100000)
269s
269s Cumulative distribution of latencies:
269s 0.000% <= 0.103 milliseconds (cumulative count 0)
269s 0.520% <= 0.303 milliseconds (cumulative count 520)
269s 3.790% <= 0.407 milliseconds (cumulative count 3790)
269s 8.870% <= 0.503 milliseconds (cumulative count 8870)
269s 34.720% <= 0.607 milliseconds (cumulative count 34720)
269s 59.920% <= 0.703 milliseconds (cumulative count 59920)
269s 78.760% <= 0.807 milliseconds (cumulative count 78760)
269s 92.670% <= 0.903 milliseconds (cumulative count 92670)
269s 98.690% <= 1.007 milliseconds (cumulative count 98690)
269s 99.570% <= 1.103 milliseconds (cumulative count 99570)
269s 99.860% <= 1.207 milliseconds (cumulative count 99860)
269s 99.930% <= 1.303 milliseconds (cumulative count 99930)
269s 99.980% <= 1.407 milliseconds (cumulative count 99980)
269s 100.000% <= 1.503 milliseconds (cumulative count 100000)
269s
269s Summary:
269s throughput summary: 641025.62 requests per second
269s latency summary (msec):
269s           avg       min       p50       p95       p99       max
269s         0.679     0.216     0.663     0.927     1.031     1.455
270s GET: rps=236280.0 (overall: 694941.2) avg_msec=0.602 (overall: 0.602)
====== GET ======
270s 100000 requests completed in 0.14 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.239 milliseconds (cumulative count 20)
270s 50.000% <= 0.591 milliseconds (cumulative count 51100)
270s 75.000% <= 0.719 milliseconds (cumulative count 76020)
270s 87.500% <= 0.807 milliseconds (cumulative count 88330)
270s 93.750% <= 0.863 milliseconds (cumulative count 94210)
270s 96.875% <= 0.919 milliseconds (cumulative count 97170)
270s 98.438% <= 0.967 milliseconds (cumulative count 98530)
270s 99.219% <= 1.031 milliseconds (cumulative count 99330)
270s 99.609% <= 1.071 milliseconds (cumulative count 99630)
270s 99.805% <= 1.111 milliseconds (cumulative count 99820)
270s 99.902% <= 1.151 milliseconds (cumulative count 99910)
270s 99.951% <= 1.191 milliseconds (cumulative count 99960)
270s 99.976% <= 1.199 milliseconds (cumulative count 99980)
270s 99.988% <= 1.223 milliseconds (cumulative count 99990)
270s 99.994% <= 1.279 milliseconds (cumulative count 100000)
270s 100.000% <= 1.279 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 1.850% <= 0.303 milliseconds (cumulative count 1850)
270s 9.870% <= 0.407 milliseconds (cumulative count 9870)
270s 24.410% <= 0.503 milliseconds (cumulative count 24410)
270s 55.340% <= 0.607 milliseconds (cumulative count 55340)
270s 73.640% <= 0.703 milliseconds (cumulative count 73640)
270s 88.330% <= 0.807 milliseconds (cumulative count 88330)
270s 96.480% <= 0.903 milliseconds (cumulative count 96480)
270s 99.020% <= 1.007 milliseconds (cumulative count 99020)
270s 99.800% <= 1.103 milliseconds (cumulative count 99800)
270s 99.980% <= 1.207 milliseconds (cumulative count 99980)
270s 100.000% <= 1.303 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 689655.19 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.607     0.232     0.591     0.879     1.007     1.279
270s ====== INCR ======
270s 100000 requests completed in 0.15 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.215 milliseconds (cumulative count 10)
270s 50.000% <= 0.599 milliseconds (cumulative count 50100)
270s 75.000% <= 0.735 milliseconds (cumulative count 75350)
270s 87.500% <= 0.823 milliseconds (cumulative count 87830)
270s 93.750% <= 0.887 milliseconds (cumulative count 94300)
270s 96.875% <= 0.943 milliseconds (cumulative count 97060)
270s 98.438% <= 0.999 milliseconds (cumulative count 98510)
270s 99.219% <= 1.063 milliseconds (cumulative count 99240)
270s 99.609% <= 1.127 milliseconds (cumulative count 99640)
270s 99.805% <= 1.175 milliseconds (cumulative count 99810)
270s 99.902% <= 1.255 milliseconds (cumulative count 99920)
270s 99.951% <= 1.287 milliseconds (cumulative count 99970)
270s 99.976% <= 1.303 milliseconds (cumulative count 99980)
270s 99.988% <= 1.319 milliseconds (cumulative count 99990)
270s 99.994% <= 1.335 milliseconds (cumulative count 100000)
270s 100.000% <= 1.335 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 1.860% <= 0.303 milliseconds (cumulative count 1860)
270s 9.270% <= 0.407 milliseconds (cumulative count 9270)
270s 21.350% <= 0.503 milliseconds (cumulative count 21350)
270s 52.240% <= 0.607 milliseconds (cumulative count 52240)
270s 70.510% <= 0.703 milliseconds (cumulative count 70510)
270s 85.590% <= 0.807 milliseconds (cumulative count 85590)
270s 95.290% <= 0.903 milliseconds (cumulative count 95290)
270s 98.630% <= 1.007 milliseconds (cumulative count 98630)
270s 99.510% <= 1.103 milliseconds (cumulative count 99510)
270s 99.850% <= 1.207 milliseconds (cumulative count 99850)
270s 99.980% <= 1.303 milliseconds (cumulative count 99980)
270s 100.000% <= 1.407 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 675675.69 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.622     0.208     0.599     0.903     1.039     1.335
270s LPUSH: rps=90478.1 (overall: 567750.0) avg_msec=0.753 (overall: 0.753)
====== LPUSH ======
270s 100000 requests completed in 0.17 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.191 milliseconds (cumulative count 10)
270s 50.000% <= 0.735 milliseconds (cumulative count 51540)
270s 75.000% <= 0.863 milliseconds (cumulative count 76140)
270s 87.500% <= 0.943 milliseconds (cumulative count 87610)
270s 93.750% <= 0.999 milliseconds (cumulative count 94000)
270s 96.875% <= 1.047 milliseconds (cumulative count 97230)
270s 98.438% <= 1.095 milliseconds (cumulative count 98520)
270s 99.219% <= 1.151 milliseconds (cumulative count 99290)
270s 99.609% <= 1.199 milliseconds (cumulative count 99640)
270s 99.805% <= 1.263 milliseconds (cumulative count 99870)
270s 99.902% <= 1.295 milliseconds (cumulative count 99920)
270s 99.951% <= 1.327 milliseconds (cumulative count 99970)
270s 99.976% <= 1.335 milliseconds (cumulative count 99980)
270s 99.988% <= 1.343 milliseconds (cumulative count 99990)
270s 99.994% <= 1.359 milliseconds (cumulative count 100000)
270s 100.000% <= 1.359 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 0.020% <= 0.207 milliseconds (cumulative count 20)
270s 0.360% <= 0.303 milliseconds (cumulative count 360)
270s 2.640% <= 0.407 milliseconds (cumulative count 2640)
270s 5.460% <= 0.503 milliseconds (cumulative count 5460)
270s 17.360% <= 0.607 milliseconds (cumulative count 17360)
270s 43.750% <= 0.703 milliseconds (cumulative count 43750)
270s 66.340% <= 0.807 milliseconds (cumulative count 66340)
270s 82.070% <= 0.903 milliseconds (cumulative count 82070)
270s 94.700% <= 1.007 milliseconds (cumulative count 94700)
270s 98.710% <= 1.103 milliseconds (cumulative count 98710)
270s 99.650% <= 1.207 milliseconds (cumulative count 99650)
270s 99.920% <= 1.303 milliseconds (cumulative count 99920)
270s 100.000% <= 1.407 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 588235.31 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.745     0.184     0.735     1.015     1.127     1.359
270s RPUSH: rps=302960.0 (overall: 636470.6) avg_msec=0.680 (overall: 0.680)
====== RPUSH ======
270s 100000 requests completed in 0.16 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.215 milliseconds (cumulative count 10)
270s 50.000% <= 0.671 milliseconds (cumulative count 50980)
270s 75.000% <= 0.799 milliseconds (cumulative count 75360)
270s 87.500% <= 0.879 milliseconds (cumulative count 87780)
270s 93.750% <= 0.935 milliseconds (cumulative count 94380)
270s 96.875% <= 0.983 milliseconds (cumulative count 97110)
270s 98.438% <= 1.031 milliseconds (cumulative count 98440)
270s 99.219% <= 1.095 milliseconds (cumulative count 99260)
270s 99.609% <= 1.143 milliseconds (cumulative count 99670)
270s 99.805% <= 1.167 milliseconds (cumulative count 99810)
270s 99.902% <= 1.215 milliseconds (cumulative count 99920)
270s 99.951% <= 1.247 milliseconds (cumulative count 99980)
270s 99.988% <= 1.311 milliseconds (cumulative count 99990)
270s 99.994% <= 1.319 milliseconds (cumulative count 100000)
270s 100.000% <= 1.319 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 0.690% <= 0.303 milliseconds (cumulative count 690)
270s 4.170% <= 0.407 milliseconds (cumulative count 4170)
270s 9.600% <= 0.503 milliseconds (cumulative count 9600)
270s 33.910% <= 0.607 milliseconds (cumulative count 33910)
270s 58.030% <= 0.703 milliseconds (cumulative count 58030)
270s 76.730% <= 0.807 milliseconds (cumulative count 76730)
270s 90.870% <= 0.903 milliseconds (cumulative count 90870)
270s 97.910% <= 1.007 milliseconds (cumulative count 97910)
270s 99.350% <= 1.103 milliseconds (cumulative count 99350)
270s 99.900% <= 1.207 milliseconds (cumulative count 99900)
270s 99.980% <= 1.303 milliseconds (cumulative count 99980)
270s 100.000% <= 1.407 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 632911.38 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.685     0.208     0.671     0.951     1.071     1.319
270s ====== LPOP ======
270s 100000 requests completed in 0.19 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.247 milliseconds (cumulative count 10)
270s 50.000% <= 0.815 milliseconds (cumulative count 50000)
270s 75.000% <= 0.951 milliseconds (cumulative count 75460)
270s 87.500% <= 1.039 milliseconds (cumulative count 87960)
270s 93.750% <= 1.103 milliseconds (cumulative count 94350)
270s 96.875% <= 1.159 milliseconds (cumulative count 97060)
270s 98.438% <= 1.215 milliseconds (cumulative count 98450)
270s 99.219% <= 1.271 milliseconds (cumulative count 99330)
270s 99.609% <= 1.311 milliseconds (cumulative count 99630)
270s 99.805% <= 1.359 milliseconds (cumulative count 99820)
270s 99.902% <= 1.407 milliseconds (cumulative count 99910)
270s 99.951% <= 1.463 milliseconds (cumulative count 99960)
270s 99.976% <= 1.479 milliseconds (cumulative count 99980)
270s 99.988% <= 1.495 milliseconds (cumulative count 99990)
270s 99.994% <= 1.559 milliseconds (cumulative count 100000)
270s 100.000% <= 1.559 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 0.150% <= 0.303 milliseconds (cumulative count 150)
270s 1.330% <= 0.407 milliseconds (cumulative count 1330)
270s 3.260% <= 0.503 milliseconds (cumulative count 3260)
270s 8.870% <= 0.607 milliseconds (cumulative count 8870)
270s 26.720% <= 0.703 milliseconds (cumulative count 26720)
270s 48.340% <= 0.807 milliseconds (cumulative count 48340)
270s 67.360% <= 0.903 milliseconds (cumulative count 67360)
270s 83.720% <= 1.007 milliseconds (cumulative count 83720)
270s 94.350% <= 1.103 milliseconds (cumulative count 94350)
270s 98.300% <= 1.207 milliseconds (cumulative count 98300)
270s 99.570% <= 1.303 milliseconds (cumulative count 99570)
270s 99.910% <= 1.407 milliseconds (cumulative count 99910)
270s 99.990% <= 1.503 milliseconds (cumulative count 99990)
270s 100.000% <= 1.607 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 540540.56 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.825     0.240     0.815     1.119     1.247     1.559
270s RPOP: rps=50360.0 (overall: 572272.8) avg_msec=0.773 (overall: 0.773)
====== RPOP ======
270s 100000 requests completed in 0.17 seconds
270s 50 parallel clients
270s 3 bytes payload
270s keep alive: 1
270s host configuration "save": 3600 1 300 100 60 10000
270s host configuration "appendonly": no
270s multi-thread: no
270s
270s Latency by percentile distribution:
270s 0.000% <= 0.255 milliseconds (cumulative count 10)
270s 50.000% <= 0.767 milliseconds (cumulative count 51110)
270s 75.000% <= 0.895 milliseconds (cumulative count 75720)
270s 87.500% <= 0.983 milliseconds (cumulative count 88180)
270s 93.750% <= 1.039 milliseconds (cumulative count 94240)
270s 96.875% <= 1.087 milliseconds (cumulative count 96890)
270s 98.438% <= 1.143 milliseconds (cumulative count 98450)
270s 99.219% <= 1.191 milliseconds (cumulative count 99250)
270s 99.609% <= 1.239 milliseconds (cumulative count 99650)
270s 99.805% <= 1.287 milliseconds (cumulative count 99820)
270s 99.902% <= 1.335 milliseconds (cumulative count 99920)
270s 99.951% <= 1.375 milliseconds (cumulative count 99960)
270s 99.976% <= 1.399 milliseconds (cumulative count 99980)
270s 99.988% <= 1.439 milliseconds (cumulative count 99990)
270s 99.994% <= 1.463 milliseconds (cumulative count 100000)
270s 100.000% <= 1.463 milliseconds (cumulative count 100000)
270s
270s Cumulative distribution of latencies:
270s 0.000% <= 0.103 milliseconds (cumulative count 0)
270s 0.100% <= 0.303 milliseconds (cumulative count 100)
270s 1.540% <= 0.407 milliseconds (cumulative count 1540)
270s 3.740% <= 0.503 milliseconds (cumulative count 3740)
270s 13.030% <= 0.607 milliseconds (cumulative count 13030)
270s 36.540% <= 0.703 milliseconds (cumulative count 36540)
270s 59.690% <= 0.807 milliseconds (cumulative count 59690)
270s 77.130% <= 0.903 milliseconds (cumulative count 77130)
270s 91.370% <= 1.007 milliseconds (cumulative count 91370)
270s 97.390% <= 1.103 milliseconds (cumulative count 97390)
270s 99.380% <= 1.207 milliseconds (cumulative count 99380)
270s 99.850% <= 1.303 milliseconds (cumulative count 99850)
270s 99.980% <= 1.407 milliseconds (cumulative count 99980)
270s 100.000% <= 1.503 milliseconds (cumulative count 100000)
270s
270s Summary:
270s throughput summary: 571428.56 requests per second
270s latency summary (msec):
270s           avg       min       p50       p95       p99       max
270s         0.777     0.248     0.767     1.055     1.175     1.463
271s SADD: rps=255816.8 (overall: 668854.2) avg_msec=0.629 (overall: 0.629)
====== SADD ======
271s 100000 requests completed in 0.15 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.159 milliseconds (cumulative count 10)
271s 50.000% <= 0.615 milliseconds (cumulative count 50170)
271s 75.000% <= 0.743 milliseconds (cumulative count 75400)
271s 87.500% <= 0.831 milliseconds (cumulative count 88460)
271s 93.750% <= 0.887 milliseconds (cumulative count 94500)
271s 96.875% <= 0.935 milliseconds (cumulative count 97090)
271s 98.438% <= 0.983 milliseconds (cumulative count 98560)
271s 99.219% <= 1.031 milliseconds (cumulative count 99280)
271s 99.609% <= 1.087 milliseconds (cumulative count 99630)
271s 99.805% <= 1.143 milliseconds (cumulative count 99830)
271s 99.902% <= 1.167 milliseconds (cumulative count 99910)
271s 99.951% <= 1.215 milliseconds (cumulative count 99970)
271s 99.976% <= 1.223 milliseconds (cumulative count 99980)
271s 99.988% <= 1.239 milliseconds (cumulative count 99990)
271s 99.994% <= 1.247 milliseconds (cumulative count 100000)
271s 100.000% <= 1.247 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.010% <= 0.207 milliseconds (cumulative count 10)
271s 1.600% <= 0.303 milliseconds (cumulative count 1600)
271s 8.040% <= 0.407 milliseconds (cumulative count 8040)
271s 19.080% <= 0.503 milliseconds (cumulative count 19080)
271s 48.080% <= 0.607 milliseconds (cumulative count 48080)
271s 68.780% <= 0.703 milliseconds (cumulative count 68780)
271s 85.060% <= 0.807 milliseconds (cumulative count 85060)
271s 95.540% <= 0.903 milliseconds (cumulative count 95540)
271s 98.950% <= 1.007 milliseconds (cumulative count 98950)
271s 99.690% <= 1.103 milliseconds (cumulative count 99690)
271s 99.950% <= 1.207 milliseconds (cumulative count 99950)
271s 100.000% <= 1.303 milliseconds (cumulative count 100000)
271s
271s Summary:
271s throughput summary: 671140.94 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.631     0.152     0.615     0.895     1.015     1.247
271s ====== HSET ======
271s 100000 requests completed in 0.17 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.223 milliseconds (cumulative count 10)
271s 50.000% <= 0.711 milliseconds (cumulative count 50400)
271s 75.000% <= 0.839 milliseconds (cumulative count 75050)
271s 87.500% <= 0.927 milliseconds (cumulative count 87940)
271s 93.750% <= 0.975 milliseconds (cumulative count 93980)
271s 96.875% <= 1.023 milliseconds (cumulative count 97160)
271s 98.438% <= 1.071 milliseconds (cumulative count 98480)
271s 99.219% <= 1.135 milliseconds (cumulative count 99260)
271s 99.609% <= 1.175 milliseconds (cumulative count 99650)
271s 99.805% <= 1.215 milliseconds (cumulative count 99810)
271s 99.902% <= 1.263 milliseconds (cumulative count 99930)
271s 99.951% <= 1.279 milliseconds (cumulative count 99960)
271s 99.976% <= 1.295 milliseconds (cumulative count 99980)
271s 99.988% <= 1.327 milliseconds (cumulative count 99990)
271s 99.994% <= 1.375 milliseconds (cumulative count 100000)
271s 100.000% <= 1.375 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.530% <= 0.303 milliseconds (cumulative count 530)
271s 3.740% <= 0.407 milliseconds (cumulative count 3740)
271s 8.330% <= 0.503 milliseconds (cumulative count 8330)
271s 23.720% <= 0.607 milliseconds (cumulative count 23720)
271s 48.660% <= 0.703 milliseconds (cumulative count 48660)
271s 69.650% <= 0.807 milliseconds (cumulative count 69650)
271s 84.570% <= 0.903 milliseconds (cumulative count 84570)
271s 96.520% <= 1.007 milliseconds (cumulative count 96520)
271s 98.920% <= 1.103 milliseconds (cumulative count 98920)
271s 99.790% <= 1.207 milliseconds (cumulative count 99790)
271s 99.980% <= 1.303 milliseconds (cumulative count 99980)
271s 100.000% <= 1.407 milliseconds (cumulative count 100000)
271s
271s Summary:
271s throughput summary: 602409.69 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.722     0.216     0.711     0.991     1.111     1.375
271s SPOP: rps=87040.0 (overall: 805925.9) avg_msec=0.467 (overall: 0.467)
====== SPOP ======
271s 100000 requests completed in 0.13 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.111 milliseconds (cumulative count 10)
271s 50.000% <= 0.447 milliseconds (cumulative count 50730)
271s 75.000% <= 0.551 milliseconds (cumulative count 76290)
271s 87.500% <= 0.647 milliseconds (cumulative count 88250)
271s 93.750% <= 0.719 milliseconds (cumulative count 93910)
271s 96.875% <= 0.783 milliseconds (cumulative count 97080)
271s 98.438% <= 0.855 milliseconds (cumulative count 98540)
271s 99.219% <= 0.919 milliseconds (cumulative count 99250)
271s 99.609% <= 0.991 milliseconds (cumulative count 99660)
271s 99.805% <= 1.023 milliseconds (cumulative count 99810)
271s 99.902% <= 1.087 milliseconds (cumulative count 99910)
271s 99.951% <= 1.127 milliseconds (cumulative count 99960)
271s 99.976% <= 1.151 milliseconds (cumulative count 99980)
271s 99.988% <= 1.167 milliseconds (cumulative count 99990)
271s 99.994% <= 1.191 milliseconds (cumulative count 100000)
271s 100.000% <= 1.191 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.130% <= 0.207 milliseconds (cumulative count 130)
271s 9.480% <= 0.303 milliseconds (cumulative count 9480)
271s 39.710% <= 0.407 milliseconds (cumulative count 39710)
271s 65.610% <= 0.503 milliseconds (cumulative count 65610)
271s 84.260% <= 0.607 milliseconds (cumulative count 84260)
271s 92.880% <= 0.703 milliseconds (cumulative count 92880)
271s 97.670% <= 0.807 milliseconds (cumulative count 97670)
271s 99.080% <= 0.903 milliseconds (cumulative count 99080)
271s 99.730% <= 1.007 milliseconds (cumulative count 99730)
271s 99.930% <= 1.103 milliseconds (cumulative count 99930)
271s 100.000% <= 1.207 milliseconds (cumulative count 100000)
271s
271s
Summary:
271s throughput summary: 787401.56 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.467     0.104     0.447     0.743     0.895     1.191
271s ZADD: rps=336040.0 (overall: 567635.1) avg_msec=0.792 (overall: 0.792)
====== ZADD ======
271s 100000 requests completed in 0.17 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.343 milliseconds (cumulative count 10)
271s 50.000% <= 0.775 milliseconds (cumulative count 51330)
271s 75.000% <= 0.895 milliseconds (cumulative count 75530)
271s 87.500% <= 0.975 milliseconds (cumulative count 88060)
271s 93.750% <= 1.023 milliseconds (cumulative count 94270)
271s 96.875% <= 1.063 milliseconds (cumulative count 97100)
271s 98.438% <= 1.103 milliseconds (cumulative count 98510)
271s 99.219% <= 1.151 milliseconds (cumulative count 99280)
271s 99.609% <= 1.191 milliseconds (cumulative count 99620)
271s 99.805% <= 1.247 milliseconds (cumulative count 99820)
271s 99.902% <= 1.287 milliseconds (cumulative count 99920)
271s 99.951% <= 1.319 milliseconds (cumulative count 99960)
271s 99.976% <= 1.367 milliseconds (cumulative count 99980)
271s 99.988% <= 1.399 milliseconds (cumulative count 99990)
271s 99.994% <= 1.471 milliseconds (cumulative count 100000)
271s 100.000% <= 1.471 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.150% <= 0.407 milliseconds (cumulative count 150)
271s 0.480% <= 0.503 milliseconds (cumulative count 480)
271s 7.810% <= 0.607 milliseconds (cumulative count 7810)
271s 34.120% <= 0.703 milliseconds (cumulative count 34120)
271s 58.770% <= 0.807 milliseconds (cumulative count 58770)
271s 76.790% <= 0.903 milliseconds (cumulative count 76790)
271s 92.500% <= 1.007 milliseconds (cumulative count 92500)
271s 98.510% <= 1.103 milliseconds (cumulative count 98510)
271s 99.700% <= 1.207 milliseconds (cumulative count 99700)
271s 99.930% <= 1.303 milliseconds (cumulative count 99930)
271s 99.990% <= 1.407 milliseconds (cumulative count 99990)
271s 100.000% <= 1.503 milliseconds (cumulative count 100000)
271s
271s Summary:
271s throughput summary: 571428.56 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.787     0.336     0.775     1.031     1.135     1.471
271s ====== ZPOPMIN ======
271s 100000 requests completed in 0.12 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.191 milliseconds (cumulative count 10)
271s 50.000% <= 0.495 milliseconds (cumulative count 52610)
271s 75.000% <= 0.583 milliseconds (cumulative count 76490)
271s 87.500% <= 0.655 milliseconds (cumulative count 88200)
271s 93.750% <= 0.703 milliseconds (cumulative count 94430)
271s 96.875% <= 0.735 milliseconds (cumulative count 97240)
271s 98.438% <= 0.759 milliseconds (cumulative count 98470)
271s 99.219% <= 0.791 milliseconds (cumulative count 99240)
271s 99.609% <= 0.839 milliseconds (cumulative count 99670)
271s 99.805% <= 0.863 milliseconds (cumulative count 99820)
271s 99.902% <= 0.903 milliseconds (cumulative count 99910)
271s 99.951% <= 0.967 milliseconds (cumulative count 99970)
271s 99.976% <= 0.983 milliseconds (cumulative count 99990)
271s 99.994% <= 1.023 milliseconds (cumulative count 100000)
271s 100.000% <= 1.023 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.020% <= 0.207 milliseconds (cumulative count 20)
271s 4.200% <= 0.303 milliseconds (cumulative count 4200)
271s 23.310% <= 0.407 milliseconds (cumulative count 23310)
271s 55.580% <= 0.503 milliseconds (cumulative count 55580)
271s 80.810% <= 0.607 milliseconds (cumulative count 80810)
271s 94.430% <= 0.703 milliseconds (cumulative count 94430)
271s 99.460% <= 0.807 milliseconds (cumulative count 99460)
271s 99.910% <= 0.903 milliseconds (cumulative count 99910)
271s 99.990% <= 1.007 milliseconds (cumulative count 99990)
271s 100.000% <= 1.103 milliseconds (cumulative count 100000)
271s
271s Summary:
271s throughput summary: 826446.31 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.499     0.184     0.495     0.711     0.783     1.023
271s LPUSH (needed to benchmark LRANGE): rps=233760.0 (overall: 590303.1) avg_msec=0.753 (overall: 0.753)
====== LPUSH (needed to benchmark LRANGE) ======
271s 100000 requests completed in 0.17 seconds
271s 50 parallel clients
271s 3 bytes payload
271s keep alive: 1
271s host configuration "save": 3600 1 300 100 60 10000
271s host configuration "appendonly": no
271s multi-thread: no
271s
271s Latency by percentile distribution:
271s 0.000% <= 0.383 milliseconds (cumulative count 20)
271s 50.000% <= 0.727 milliseconds (cumulative count 50540)
271s 75.000% <= 0.855 milliseconds (cumulative count 75640)
271s 87.500% <= 0.935 milliseconds (cumulative count 88030)
271s 93.750% <= 0.983 milliseconds (cumulative count 93950)
271s 96.875% <= 1.023 milliseconds (cumulative count 97070)
271s 98.438% <= 1.063 milliseconds (cumulative count 98520)
271s 99.219% <= 1.111 milliseconds (cumulative count 99300)
271s 99.609% <= 1.175 milliseconds (cumulative count 99630)
271s 99.805% <= 1.247 milliseconds (cumulative count 99810)
271s 99.902% <= 1.311 milliseconds (cumulative count 99920)
271s 99.951% <= 1.367 milliseconds (cumulative count 99960)
271s 99.976% <= 1.383 milliseconds (cumulative count 99980)
271s 99.988% <= 1.399 milliseconds (cumulative count 99990)
271s 99.994% <= 1.431 milliseconds (cumulative count 100000)
271s 100.000% <= 1.431 milliseconds (cumulative count 100000)
271s
271s Cumulative distribution of latencies:
271s 0.000% <= 0.103 milliseconds (cumulative count 0)
271s 0.080% <= 0.407 milliseconds (cumulative count 80)
271s 0.660% <= 0.503 milliseconds (cumulative count 660)
271s 14.690% <= 0.607 milliseconds (cumulative count 14690)
271s 44.590% <= 0.703 milliseconds (cumulative count 44590)
271s 67.450% <= 0.807 milliseconds (cumulative count 67450)
271s 83.110% <= 0.903 milliseconds (cumulative count 83110)
271s 96.060% <= 1.007 milliseconds (cumulative count 96060)
271s 99.200% <= 1.103 milliseconds (cumulative count 99200)
271s 99.720% <= 1.207 milliseconds (cumulative count 99720)
271s 99.900% <= 1.303 milliseconds (cumulative count 99900)
271s 99.990% <= 1.407 milliseconds (cumulative count 99990)
271s 100.000% <= 1.503 milliseconds (cumulative count 100000)
271s
271s Summary:
271s throughput summary: 595238.12 requests per second
271s latency summary (msec):
271s           avg       min       p50       p95       p99       max
271s         0.751     0.376     0.727     0.999     1.095     1.431
272s LRANGE_100 (first 100 elements): rps=95617.5 (overall: 133333.3) avg_msec=1.966 (overall: 1.966)
LRANGE_100 (first 100 elements): rps=134263.0 (overall: 133874.7) avg_msec=1.918 (overall: 1.938)
LRANGE_100 (first 100 elements): rps=136055.8 (overall: 134677.4) avg_msec=1.875 (overall: 1.914)
====== LRANGE_100 (first 100 elements) ======
272s 100000 requests completed in 0.74 seconds
272s 50 parallel clients
272s 3 bytes payload
272s keep alive: 1
272s host configuration "save": 3600 1 300 100 60 10000
272s host configuration "appendonly": no
272s multi-thread: no
272s
272s Latency by percentile distribution:
272s 0.000% <= 0.431 milliseconds (cumulative count 10)
272s 50.000% <= 1.879 milliseconds (cumulative count 51050)
272s 75.000% <= 1.951 milliseconds (cumulative count 77120)
272s 87.500% <= 2.015 milliseconds (cumulative count 87960)
272s 93.750% <= 2.103 milliseconds (cumulative count 93990)
272s 96.875% <= 2.191 milliseconds (cumulative count 96880)
272s 98.438% <= 2.311 milliseconds (cumulative count 98480)
99.219% <= 2.591 milliseconds (cumulative count 99230)
272s 99.609% <= 3.919 milliseconds (cumulative count 99610)
272s 99.805% <= 4.759 milliseconds (cumulative count 99810)
272s 99.902% <= 5.199 milliseconds (cumulative count 99910)
272s 99.951% <= 5.615 milliseconds (cumulative count 99960)
272s 99.976% <= 5.687 milliseconds (cumulative count 99980)
272s 99.988% <= 5.751 milliseconds (cumulative count 99990)
272s 99.994% <= 5.823 milliseconds (cumulative count 100000)
272s 100.000% <= 5.823 milliseconds (cumulative count 100000)
272s
272s Cumulative distribution of latencies:
272s 0.000% <= 0.103 milliseconds (cumulative count 0)
272s 0.010% <= 0.503 milliseconds (cumulative count 10)
272s 0.020% <= 1.007 milliseconds (cumulative count 20)
272s 0.030% <= 1.103 milliseconds (cumulative count 30)
272s 0.050% <= 1.207 milliseconds (cumulative count 50)
272s 0.060% <= 1.303 milliseconds (cumulative count 60)
272s 0.070% <= 1.407 milliseconds (cumulative count 70)
272s 0.090% <= 1.503 milliseconds (cumulative count 90)
272s 0.300% <= 1.607 milliseconds (cumulative count 300)
272s 1.160% <= 1.703 milliseconds (cumulative count 1160)
272s 15.500% <= 1.807 milliseconds (cumulative count 15500)
272s 61.490% <= 1.903 milliseconds (cumulative count 61490)
272s 86.790% <= 2.007 milliseconds (cumulative count 86790)
272s 93.990% <= 2.103 milliseconds (cumulative count 93990)
272s 99.500% <= 3.103 milliseconds (cumulative count 99500)
272s 99.650% <= 4.103 milliseconds (cumulative count 99650)
272s 99.890% <= 5.103 milliseconds (cumulative count 99890)
272s 100.000% <= 6.103 milliseconds (cumulative count 100000)
272s
272s Summary:
272s throughput summary: 134770.89 requests per second
272s latency summary (msec):
272s           avg       min       p50       p95       p99       max
272s         1.912     0.424     1.879     2.127     2.447     5.823
275s LRANGE_300 (first 300 elements): rps=24555.6 (overall: 32568.4) avg_msec=7.956 (overall: 7.956)
LRANGE_300 (first 300 elements): rps=33322.7 (overall: 32997.7) avg_msec=7.796 (overall: 7.864)
275s ====== LRANGE_300 (first 300 elements) ======
275s   100000 requests completed in 2.96 seconds
275s   50 parallel clients
275s   3 bytes payload
275s   keep alive: 1
275s   host configuration "save": 3600 1 300 100 60 10000
275s   host configuration "appendonly": no
275s   multi-thread: no
275s
275s Latency by percentile distribution:
275s 0.000% <= 0.687 milliseconds (cumulative count 10)
275s 50.000% <= 7.151 milliseconds (cumulative count 50080)
275s 75.000% <= 8.791 milliseconds (cumulative count 75060)
275s 87.500% <= 10.607 milliseconds (cumulative count 87520)
275s 93.750% <= 11.951 milliseconds (cumulative count 93760)
275s 96.875% <= 13.543 milliseconds (cumulative count 96880)
275s 98.438% <= 14.743 milliseconds (cumulative count 98450)
275s 99.219% <= 15.935 milliseconds (cumulative count 99220)
275s 99.609% <= 16.879 milliseconds (cumulative count 99610)
275s 99.805% <= 17.983 milliseconds (cumulative count 99810)
275s 99.902% <= 18.671 milliseconds (cumulative count 99910)
275s 99.951% <= 19.823 milliseconds (cumulative count 99960)
275s 99.976% <= 21.135 milliseconds (cumulative count 99980)
275s 99.988% <= 21.343 milliseconds (cumulative count 99990)
275s 99.994% <= 21.519 milliseconds (cumulative count 100000)
275s 100.000% <= 21.519 milliseconds (cumulative count 100000)
275s
275s Cumulative distribution of latencies:
275s 0.000% <= 0.103 milliseconds (cumulative count 0)
275s 0.010% <= 0.703 milliseconds (cumulative count 10)
275s 0.020% <= 0.807 milliseconds (cumulative count 20)
275s 0.050% <= 0.903 milliseconds (cumulative count 50)
275s 0.070% <= 1.007 milliseconds (cumulative count 70)
275s 0.150% <= 1.103 milliseconds (cumulative count 150)
275s 0.220% <= 1.207 milliseconds (cumulative count 220)
275s 0.280% <= 1.303 milliseconds (cumulative count 280)
275s 0.340% <= 1.407 milliseconds (cumulative count 340)
275s 0.380% <= 1.503 milliseconds (cumulative count 380)
275s 0.460% <= 1.607 milliseconds (cumulative count 460)
275s 0.490% <= 1.703 milliseconds (cumulative count 490)
275s 0.540% <= 1.807 milliseconds (cumulative count 540)
275s 0.580% <= 1.903 milliseconds (cumulative count 580)
275s 0.660% <= 2.007 milliseconds (cumulative count 660)
275s 0.740% <= 2.103 milliseconds (cumulative count 740)
275s 1.970% <= 3.103 milliseconds (cumulative count 1970)
275s 5.530% <= 4.103 milliseconds (cumulative count 5530)
275s 14.060% <= 5.103 milliseconds (cumulative count 14060)
275s 29.830% <= 6.103 milliseconds (cumulative count 29830)
275s 49.270% <= 7.103 milliseconds (cumulative count 49270)
275s 66.320% <= 8.103 milliseconds (cumulative count 66320)
275s 78.120% <= 9.103 milliseconds (cumulative count 78120)
275s 84.820% <= 10.103 milliseconds (cumulative count 84820)
275s 90.250% <= 11.103 milliseconds (cumulative count 90250)
275s 94.140% <= 12.103 milliseconds (cumulative count 94140)
275s 96.200% <= 13.103 milliseconds (cumulative count 96200)
275s 97.660% <= 14.103 milliseconds (cumulative count 97660)
275s 98.730% <= 15.103 milliseconds (cumulative count 98730)
275s 99.320% <= 16.103 milliseconds (cumulative count 99320)
275s 99.650% <= 17.103 milliseconds (cumulative count 99650)
275s 99.830% <= 18.111 milliseconds (cumulative count 99830)
275s 99.930% <= 19.103 milliseconds (cumulative count 99930)
275s 99.970% <= 20.111 milliseconds (cumulative count 99970)
275s 100.000% <= 22.111 milliseconds (cumulative count 100000)
275s
275s Summary:
275s   throughput summary: 33806.62 requests per second
275s   latency summary (msec):
275s           avg       min       p50       p95       p99       max
275s         7.540     0.680     7.151    12.447    15.551    21.519
280s ====== LRANGE_500 (first 500 elements) ======
280s   100000 requests completed in 5.32 seconds
280s   50 parallel clients
280s   3 bytes payload
280s   keep alive: 1
280s   host configuration "save": 3600 1 300 100 60 10000
280s   host configuration "appendonly": no
280s   multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.527 milliseconds (cumulative count 10)
280s 50.000% <= 10.631 milliseconds (cumulative count 50000)
280s 75.000% <= 13.679 milliseconds (cumulative count 75000)
280s 87.500% <= 16.159 milliseconds (cumulative count 87510)
280s 93.750% <= 18.655 milliseconds (cumulative count 93760)
280s 96.875% <= 20.863 milliseconds (cumulative count 96890)
280s 98.438% <= 22.415 milliseconds (cumulative count 98440)
280s 99.219% <= 23.647 milliseconds (cumulative count 99220)
280s 99.609% <= 25.055 milliseconds (cumulative count 99610)
280s 99.805% <= 26.239 milliseconds (cumulative count 99810)
280s 99.902% <= 27.103 milliseconds (cumulative count 99910)
280s 99.951% <= 27.471 milliseconds (cumulative count 99960)
280s 99.976% <= 27.823 milliseconds (cumulative count 99980)
280s 99.988% <= 27.983 milliseconds (cumulative count 99990)
280s 99.994% <= 28.175 milliseconds (cumulative count 100000)
280s 100.000% <= 28.175 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 0.010% <= 0.607 milliseconds (cumulative count 10)
280s 0.030% <= 1.303 milliseconds (cumulative count 30)
280s 0.040% <= 1.407 milliseconds (cumulative count 40)
280s 0.070% <= 1.503 milliseconds (cumulative count 70)
280s 0.090% <= 1.607 milliseconds (cumulative count 90)
280s 0.100% <= 1.703 milliseconds (cumulative count 100)
280s 0.140% <= 1.807 milliseconds (cumulative count 140)
280s 0.150% <= 1.903 milliseconds (cumulative count 150)
280s 0.170% <= 2.007 milliseconds (cumulative count 170)
280s 0.180% <= 2.103 milliseconds (cumulative count 180)
280s 0.390% <= 3.103 milliseconds (cumulative count 390)
280s 1.510% <= 4.103 milliseconds (cumulative count 1510)
280s 3.450% <= 5.103 milliseconds (cumulative count 3450)
280s 6.820% <= 6.103 milliseconds (cumulative count 6820)
280s 12.400% <= 7.103 milliseconds (cumulative count 12400)
280s 21.400% <= 8.103 milliseconds (cumulative count 21400)
280s 32.640% <= 9.103 milliseconds (cumulative count 32640)
280s 44.500% <= 10.103 milliseconds (cumulative count 44500)
280s 54.460% <= 11.103 milliseconds (cumulative count 54460)
280s 62.890% <= 12.103 milliseconds (cumulative count 62890)
280s 70.760% <= 13.103 milliseconds (cumulative count 70760)
280s 78.060% <= 14.103 milliseconds (cumulative count 78060)
280s 83.460% <= 15.103 milliseconds (cumulative count 83460)
280s 87.310% <= 16.103 milliseconds (cumulative count 87310)
280s 90.410% <= 17.103 milliseconds (cumulative count 90410)
280s 92.740% <= 18.111 milliseconds (cumulative count 92740)
280s 94.370% <= 19.103 milliseconds (cumulative count 94370)
280s 95.960% <= 20.111 milliseconds (cumulative count 95960)
280s 97.180% <= 21.103 milliseconds (cumulative count 97180)
280s 98.140% <= 22.111 milliseconds (cumulative count 98140)
280s 98.930% <= 23.103 milliseconds (cumulative count 98930)
280s 99.350% <= 24.111 milliseconds (cumulative count 99350)
280s 99.620% <= 25.103 milliseconds (cumulative count 99620)
280s 99.800% <= 26.111 milliseconds (cumulative count 99800)
280s 99.910% <= 27.103 milliseconds (cumulative count 99910)
280s 99.990% <= 28.111 milliseconds (cumulative count 99990)
280s 100.000% <= 29.103 milliseconds (cumulative count 100000)
280s
280s Summary:
280s   throughput summary: 18807.60 requests per second
280s   latency summary (msec):
280s           avg       min       p50       p95       p99       max
280s        11.338     0.520    10.631    19.535    23.231    28.175
287s ====== LRANGE_600 (first 600 elements) ======
287s   100000 requests completed in 6.70 seconds
287s   50 parallel clients
287s   3 bytes payload
287s   keep alive: 1
287s   host configuration "save": 3600 1 300 100 60 10000
287s   host configuration "appendonly": no
287s   multi-thread: no
287s
287s Latency by percentile distribution:
287s 0.000% <= 0.463 milliseconds (cumulative count 10)
287s 50.000% <= 14.535 milliseconds (cumulative count 50020)
287s 75.000% <= 20.431 milliseconds (cumulative count 75040)
287s 87.500% <= 23.631 milliseconds (cumulative count 87530)
287s 93.750% <= 25.455 milliseconds (cumulative count 93760)
287s 96.875% <= 26.879 milliseconds (cumulative count 96900)
287s 98.438% <= 28.079 milliseconds (cumulative count 98450)
287s 99.219% <= 29.807 milliseconds (cumulative count 99220)
287s 99.609% <= 30.559 milliseconds (cumulative count 99620)
287s 99.805% <= 31.231 milliseconds (cumulative count 99810)
287s 99.902% <= 31.919 milliseconds (cumulative count 99910)
287s 99.951% <= 32.287 milliseconds (cumulative count 99960)
287s 99.976% <= 32.479 milliseconds (cumulative count 99980)
287s 99.988% <= 32.639 milliseconds (cumulative count 99990)
287s 99.994% <= 32.671 milliseconds (cumulative count 100000)
287s 100.000% <= 32.671 milliseconds (cumulative count 100000)
287s
287s Cumulative distribution of latencies:
287s 0.000% <= 0.103 milliseconds (cumulative count 0)
287s 0.010% <= 0.503 milliseconds (cumulative count 10)
287s 0.020% <= 0.903 milliseconds (cumulative count 20)
287s 0.030% <= 1.007 milliseconds (cumulative count 30)
287s 0.060% <= 1.103 milliseconds (cumulative count 60)
287s 0.150% <= 1.207 milliseconds (cumulative count 150)
287s 0.180% <= 1.303 milliseconds (cumulative count 180)
287s 0.340% <= 1.407 milliseconds (cumulative count 340)
287s 0.440% <= 1.503 milliseconds (cumulative count 440)
287s 0.620% <= 1.607 milliseconds (cumulative count 620)
287s 0.780% <= 1.703 milliseconds (cumulative count 780)
287s 0.980% <= 1.807 milliseconds (cumulative count 980)
287s 1.210% <= 1.903 milliseconds (cumulative count 1210)
287s 1.460% <= 2.007 milliseconds (cumulative count 1460)
287s 1.630% <= 2.103 milliseconds (cumulative count 1630)
287s 2.420% <= 3.103 milliseconds (cumulative count 2420)
287s 3.520% <= 4.103 milliseconds (cumulative count 3520)
287s 5.540% <= 5.103 milliseconds (cumulative count 5540)
287s 7.760% <= 6.103 milliseconds (cumulative count 7760)
287s 11.350% <= 7.103 milliseconds (cumulative count 11350)
287s 16.560% <= 8.103 milliseconds (cumulative count 16560)
287s 22.570% <= 9.103 milliseconds (cumulative count 22570)
287s 28.290% <= 10.103 milliseconds (cumulative count 28290)
287s 34.230% <= 11.103 milliseconds (cumulative count 34230)
287s 39.140% <= 12.103 milliseconds (cumulative count 39140)
287s 43.710% <= 13.103 milliseconds (cumulative count 43710)
287s 48.070% <= 14.103 milliseconds (cumulative count 48070)
287s 52.820% <= 15.103 milliseconds (cumulative count 52820)
287s 56.550% <= 16.103 milliseconds (cumulative count 56550)
287s 61.210% <= 17.103 milliseconds (cumulative count 61210)
287s 65.960% <= 18.111 milliseconds (cumulative count 65960)
287s 69.950% <= 19.103 milliseconds (cumulative count 69950)
287s 73.790% <= 20.111 milliseconds (cumulative count 73790)
287s 77.820% <= 21.103 milliseconds (cumulative count 77820)
287s 81.670% <= 22.111 milliseconds (cumulative count 81670)
287s 85.420% <= 23.103 milliseconds (cumulative count 85420)
287s 89.430% <= 24.111 milliseconds (cumulative count 89430)
287s 92.830% <= 25.103 milliseconds (cumulative count 92830)
287s 95.350% <= 26.111 milliseconds (cumulative count 95350)
287s 97.280% <= 27.103 milliseconds (cumulative count 97280)
287s 98.490% <= 28.111 milliseconds (cumulative count 98490)
287s 98.930% <= 29.103 milliseconds (cumulative count 98930)
287s 99.350% <= 30.111 milliseconds (cumulative count 99350)
287s 99.790% <= 31.103 milliseconds (cumulative count 99790)
287s 99.940% <= 32.111 milliseconds (cumulative count 99940)
287s 100.000% <= 33.119 milliseconds (cumulative count 100000)
287s
287s Summary:
287s   throughput summary: 14929.83 requests per second
287s   latency summary (msec):
287s           avg       min       p50       p95       p99       max
287s        14.993     0.456    14.535    25.967    29.311    32.671
287s ====== MSET (10 keys) ======
287s   100000 requests completed in 0.37 seconds
287s   50 parallel clients
287s   3 bytes payload
287s   keep alive: 1
287s   host configuration "save": 3600 1 300 100 60 10000
287s   host configuration "appendonly": no
287s   multi-thread: no
287s
287s Latency by percentile distribution:
287s 0.000% <= 0.439 milliseconds (cumulative count 10)
287s 50.000% <= 1.799 milliseconds (cumulative count 51270)
287s 75.000% <= 1.943 milliseconds (cumulative count 75710)
287s 87.500% <= 2.031 milliseconds (cumulative count 88190)
287s 93.750% <= 2.095 milliseconds (cumulative count 94280)
287s 96.875% <= 2.143 milliseconds (cumulative count 96880)
287s 98.438% <= 2.199 milliseconds (cumulative count 98510)
287s 99.219% <= 2.263 milliseconds (cumulative count 99230)
287s 99.609% <= 2.351 milliseconds (cumulative count 99630)
287s 99.805% <= 2.455 milliseconds (cumulative count 99820)
287s 99.902% <= 2.527 milliseconds (cumulative count 99910)
287s 99.951% <= 2.567 milliseconds (cumulative count 99960)
287s 99.976% <= 2.583 milliseconds (cumulative count 99980)
287s 99.988% <= 2.591 milliseconds (cumulative count 99990)
287s 99.994% <= 2.607 milliseconds (cumulative count 100000)
287s 100.000% <= 2.607 milliseconds (cumulative count 100000)
287s
287s Cumulative distribution of latencies:
287s 0.000% <= 0.103 milliseconds (cumulative count 0)
287s 0.050% <= 0.503 milliseconds (cumulative count 50)
287s 0.110% <= 0.607 milliseconds (cumulative count 110)
287s 0.130% <= 0.807 milliseconds (cumulative count 130)
287s 0.140% <= 0.903 milliseconds (cumulative count 140)
287s 0.360% <= 1.007 milliseconds (cumulative count 360)
287s 2.500% <= 1.103 milliseconds (cumulative count 2500)
287s 9.730% <= 1.207 milliseconds (cumulative count 9730)
287s 12.690% <= 1.303 milliseconds (cumulative count 12690)
287s 13.460% <= 1.407 milliseconds (cumulative count 13460)
287s 14.720% <= 1.503 milliseconds (cumulative count 14720)
287s 21.020% <= 1.607 milliseconds (cumulative count 21020)
287s 33.990% <= 1.703 milliseconds (cumulative count 33990)
287s 52.610% <= 1.807 milliseconds (cumulative count 52610)
287s 68.970% <= 1.903 milliseconds (cumulative count 68970)
287s 85.060% <= 2.007 milliseconds (cumulative count 85060)
287s 94.740% <= 2.103 milliseconds (cumulative count 94740)
287s 100.000% <= 3.103 milliseconds (cumulative count 100000)
287s
287s Summary:
287s   throughput summary: 268817.19 requests per second
287s   latency summary (msec):
287s           avg       min       p50       p95       p99       max
287s         1.748     0.432     1.799     2.111     2.239     2.607
288s ====== XADD ======
288s   100000 requests completed in 0.23 seconds
288s   50 parallel clients
288s   3 bytes payload
288s   keep alive: 1
288s   host configuration "save": 3600 1 300 100 60 10000
288s   host configuration "appendonly": no
288s   multi-thread: no
288s
288s Latency by percentile distribution:
288s 0.000% <= 0.335 milliseconds (cumulative count 20)
288s 50.000% <= 1.103 milliseconds (cumulative count 50530)
288s 75.000% <= 1.239 milliseconds (cumulative count 75920)
288s 87.500% <= 1.319 milliseconds (cumulative count 88170)
288s 93.750% <= 1.375 milliseconds (cumulative count 93960)
288s 96.875% <= 1.431 milliseconds (cumulative count 97100)
288s 98.438% <= 1.479 milliseconds (cumulative count 98510)
288s 99.219% <= 1.543 milliseconds (cumulative count 99290)
288s 99.609% <= 1.599 milliseconds (cumulative count 99670)
288s 99.805% <= 1.631 milliseconds (cumulative count 99820)
288s 99.902% <= 1.671 milliseconds (cumulative count 99920)
288s 99.951% <= 1.703 milliseconds (cumulative count 99960)
288s 99.976% <= 1.735 milliseconds (cumulative count 99980)
288s 99.988% <= 1.799 milliseconds (cumulative count 99990)
288s 99.994% <= 1.839 milliseconds (cumulative count 100000)
288s 100.000% <= 1.839 milliseconds (cumulative count 100000)
288s
288s Cumulative distribution of latencies:
288s 0.000% <= 0.103 milliseconds (cumulative count 0)
288s 0.130% <= 0.407 milliseconds (cumulative count 130)
288s 0.230% <= 0.503 milliseconds (cumulative count 230)
288s 0.270% <= 0.607 milliseconds (cumulative count 270)
288s 0.820% <= 0.703 milliseconds (cumulative count 820)
288s 13.920% <= 0.807 milliseconds (cumulative count 13920)
288s 25.130% <= 0.903 milliseconds (cumulative count 25130)
288s 32.650% <= 1.007 milliseconds (cumulative count 32650)
288s 50.530% <= 1.103 milliseconds (cumulative count 50530)
288s 70.830% <= 1.207 milliseconds (cumulative count 70830)
288s 85.890% <= 1.303 milliseconds (cumulative count 85890)
288s 95.910% <= 1.407 milliseconds (cumulative count 95910)
288s 98.830% <= 1.503 milliseconds (cumulative count 98830)
288s 99.700% <= 1.607 milliseconds (cumulative count 99700)
288s 99.960% <= 1.703 milliseconds (cumulative count 99960)
288s 99.990% <= 1.807 milliseconds (cumulative count 99990)
288s 100.000% <= 1.903 milliseconds (cumulative count 100000)
288s
288s Summary:
288s   throughput summary: 427350.44 requests per second
288s   latency summary (msec):
288s           avg       min       p50       p95       p99       max
288s         1.082     0.328     1.103     1.391     1.519     1.839
288s
288s autopkgtest [12:26:13]: test 0002-benchmark: -----------------------]
288s 0002-benchmark PASS
288s autopkgtest [12:26:13]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - -
289s autopkgtest [12:26:14]: test 0003-valkey-check-aof: preparing testbed
289s Reading package lists...
289s Building dependency tree...
289s Reading state information...
289s Starting pkgProblemResolver with broken count: 0
290s Starting 2 pkgProblemResolver with broken count: 0
290s Done
290s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
291s autopkgtest [12:26:16]: test 0003-valkey-check-aof: [-----------------------
292s autopkgtest [12:26:17]: test 0003-valkey-check-aof: -----------------------]
292s 0003-valkey-check-aof PASS
292s autopkgtest [12:26:17]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - -
294s autopkgtest [12:26:19]: test 0004-valkey-check-rdb: preparing testbed
294s Reading package lists...
295s Building dependency tree...
295s Reading state information...
295s Starting pkgProblemResolver with broken count: 0
295s Starting 2 pkgProblemResolver with broken count: 0
295s Done
296s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
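The latency summaries above report avg/min/p50/p95/p99/max columns derived from the per-request latency distribution. As an annotation only (valkey-benchmark computes these internally from its own histogram; this is not its implementation), a minimal nearest-rank percentile sketch looks like:

```python
# Sketch: nearest-rank percentiles over raw latency samples (msec), mirroring
# the avg/min/p50/p95/p99/max columns of the benchmark summary lines above.
# Illustrative only -- valkey-benchmark uses its own internal histogram.

def percentile(samples, p):
    """Smallest sample such that at least p% of all samples are <= it (nearest rank)."""
    s = sorted(samples)
    rank = max(1, -(-len(s) * p // 100))  # ceil(n * p / 100), 1-based, clamped
    return s[int(rank) - 1]

def latency_summary(samples):
    """Return the same six fields printed in each 'latency summary (msec)' block."""
    return {
        "avg": sum(samples) / len(samples),
        "min": min(samples),
        "p50": percentile(samples, 50),
        "p95": percentile(samples, 95),
        "p99": percentile(samples, 99),
        "max": max(samples),
    }
```

With 100000 samples, p50 is simply the 50000th smallest latency, which is why the "Latency by percentile distribution" lines pair each percentile with a cumulative count.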
297s autopkgtest [12:26:22]: test 0004-valkey-check-rdb: [-----------------------
302s OK
302s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb
302s [offset 27] AUX FIELD valkey-ver = '8.0.2'
302s [offset 41] AUX FIELD redis-bits = '64'
302s [offset 53] AUX FIELD ctime = '1742041587'
302s [offset 68] AUX FIELD used-mem = '3067800'
302s [offset 80] AUX FIELD aof-base = '0'
302s [offset 82] Selecting DB ID 0
302s [offset 565562] Checksum OK
302s [offset 565562] \o/ RDB looks OK! \o/
302s [info] 5 keys read
302s [info] 0 expires
302s [info] 0 already expired
302s autopkgtest [12:26:27]: test 0004-valkey-check-rdb: -----------------------]
303s autopkgtest [12:26:28]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - -
303s 0004-valkey-check-rdb PASS
303s autopkgtest [12:26:28]: test 0005-cjson: preparing testbed
303s Reading package lists...
304s Building dependency tree...
304s Reading state information...
304s Starting pkgProblemResolver with broken count: 0
304s Starting 2 pkgProblemResolver with broken count: 0
304s Done
305s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
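The check above walks the dump from offset 0: header magic, then AUX fields, then keys, then the trailing checksum. As a hedged sketch (not the real valkey-check-rdb code), assuming the conventional RDB layout of an ASCII "REDIS" magic followed by a 4-digit version number, which valkey retains for compatibility:

```python
# Sketch: validate the 9-byte RDB header, assuming the conventional layout of
# ASCII magic "REDIS" followed by a 4-digit version. Illustrative only; the
# real valkey-check-rdb also walks AUX fields, keys, and the trailing CRC.

def check_rdb_header(data: bytes) -> int:
    """Return the RDB version if the header looks valid, else raise ValueError."""
    if len(data) < 9 or data[:5] != b"REDIS":
        raise ValueError("missing REDIS magic at offset 0")
    return int(data[5:9].decode("ascii"))
```

The per-offset log lines ("[offset 27] AUX FIELD ...") correspond to the byte position reached after parsing each element, ending at the file's CRC64 checksum (offset 565562 here).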
306s autopkgtest [12:26:31]: test 0005-cjson: [-----------------------
311s
311s autopkgtest [12:26:36]: test 0005-cjson: -----------------------]
312s 0005-cjson PASS
312s autopkgtest [12:26:37]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
312s autopkgtest [12:26:37]: test 0006-migrate-from-redis: preparing testbed
499s autopkgtest [12:29:44]: testbed dpkg architecture: arm64
499s autopkgtest [12:29:44]: testbed apt version: 2.9.33
500s autopkgtest [12:29:45]: @@@@@@@@@@@@@@@@@@@@ test bed setup
500s autopkgtest [12:29:45]: testbed release detected to be: plucky
501s autopkgtest [12:29:46]: updating testbed package index (apt update)
501s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
501s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
501s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
501s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
502s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [46.2 kB]
502s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
502s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [410 kB]
502s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 Packages [78.2 kB]
502s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 c-n-f Metadata [1888 B]
502s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted arm64 c-n-f Metadata [116 B]
502s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 Packages [353 kB]
502s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 c-n-f Metadata [15.7 kB]
502s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 Packages [4948 B]
502s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 c-n-f Metadata [572 B]
503s Fetched 1052 kB in 2s (648 kB/s)
504s Reading package lists...
505s Reading package lists...
505s Building dependency tree...
505s Reading state information...
505s Calculating upgrade...
506s Calculating upgrade...
506s The following packages will be upgraded:
506s   python3-jinja2 strace
506s 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
506s Need to get 608 kB of archives.
506s After this operation, 11.3 kB of additional disk space will be used.
506s Get:1 http://ftpmaster.internal/ubuntu plucky/main arm64 strace arm64 6.13+ds-1ubuntu1 [499 kB]
507s Get:2 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
508s Fetched 608 kB in 1s (576 kB/s)
508s (Reading database ... 117701 files and directories currently installed.)
508s Preparing to unpack .../strace_6.13+ds-1ubuntu1_arm64.deb ...
508s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
508s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
508s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
508s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
508s Setting up strace (6.13+ds-1ubuntu1) ...
508s Processing triggers for man-db (2.13.0-1) ...
509s Reading package lists...
509s Building dependency tree...
509s Reading state information...
510s Solving dependencies...
510s The following packages will be REMOVED:
510s   libnsl2* libpython3.12-minimal* libpython3.12-stdlib* libpython3.12t64*
510s   libunwind8* linux-headers-6.11.0-8* linux-headers-6.11.0-8-generic*
510s   linux-image-6.11.0-8-generic* linux-modules-6.11.0-8-generic*
510s   linux-tools-6.11.0-8* linux-tools-6.11.0-8-generic*
511s 0 upgraded, 0 newly installed, 11 to remove and 5 not upgraded.
511s After this operation, 267 MB disk space will be freed.
511s (Reading database ... 117701 files and directories currently installed.)
511s Removing linux-tools-6.11.0-8-generic (6.11.0-8.8) ...
511s Removing linux-tools-6.11.0-8 (6.11.0-8.8) ...
511s Removing libpython3.12t64:arm64 (3.12.9-1) ...
511s Removing libpython3.12-stdlib:arm64 (3.12.9-1) ...
511s Removing libnsl2:arm64 (1.3.0-3build3) ...
511s Removing libpython3.12-minimal:arm64 (3.12.9-1) ...
511s Removing libunwind8:arm64 (1.6.2-3.1) ...
511s Removing linux-headers-6.11.0-8-generic (6.11.0-8.8) ...
512s Removing linux-headers-6.11.0-8 (6.11.0-8.8) ...
514s Removing linux-image-6.11.0-8-generic (6.11.0-8.8) ...
514s I: /boot/vmlinuz.old is now a symlink to vmlinuz-6.14.0-10-generic
514s I: /boot/initrd.img.old is now a symlink to initrd.img-6.14.0-10-generic
514s /etc/kernel/postrm.d/initramfs-tools:
514s update-initramfs: Deleting /boot/initrd.img-6.11.0-8-generic
514s /etc/kernel/postrm.d/zz-flash-kernel:
514s flash-kernel: Kernel 6.11.0-8-generic has been removed.
514s flash-kernel: A higher version (6.14.0-10-generic) is still installed, no reflashing required.
514s /etc/kernel/postrm.d/zz-update-grub:
514s Sourcing file `/etc/default/grub'
514s Sourcing file `/etc/default/grub.d/50-cloudimg-settings.cfg'
514s Generating grub configuration file ...
514s Found linux image: /boot/vmlinuz-6.14.0-10-generic
514s Found initrd image: /boot/initrd.img-6.14.0-10-generic
515s Warning: os-prober will not be executed to detect other bootable partitions.
515s Systems on them will not be added to the GRUB boot configuration.
515s Check GRUB_DISABLE_OS_PROBER documentation entry.
515s Adding boot menu entry for UEFI Firmware Settings ...
515s done
515s Removing linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
515s Processing triggers for libc-bin (2.41-1ubuntu1) ...
515s (Reading database ... 81650 files and directories currently installed.)
515s Purging configuration files for linux-image-6.11.0-8-generic (6.11.0-8.8) ...
515s Purging configuration files for libpython3.12-minimal:arm64 (3.12.9-1) ...
515s Purging configuration files for linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
516s autopkgtest [12:30:01]: upgrading testbed (apt dist-upgrade and autopurge)
516s Reading package lists...
516s Building dependency tree...
516s Reading state information...
517s Calculating upgrade...
517s Starting pkgProblemResolver with broken count: 0
517s Starting 2 pkgProblemResolver with broken count: 0
517s Done
517s Entering ResolveByKeep
518s
518s Calculating upgrade...
518s The following packages will be upgraded:
518s   libc-bin libc-dev-bin libc6 libc6-dev locales
518s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
518s Need to get 9530 kB of archives.
518s After this operation, 0 B of additional disk space will be used.
518s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6-dev arm64 2.41-1ubuntu2 [1750 kB]
521s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-dev-bin arm64 2.41-1ubuntu2 [24.0 kB]
521s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6 arm64 2.41-1ubuntu2 [2910 kB]
524s Get:4 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-bin arm64 2.41-1ubuntu2 [600 kB]
525s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 locales all 2.41-1ubuntu2 [4246 kB]
530s Preconfiguring packages ...
530s Fetched 9530 kB in 12s (824 kB/s)
530s (Reading database ... 81647 files and directories currently installed.)
530s Preparing to unpack .../libc6-dev_2.41-1ubuntu2_arm64.deb ...
530s Unpacking libc6-dev:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
530s Preparing to unpack .../libc-dev-bin_2.41-1ubuntu2_arm64.deb ...
530s Unpacking libc-dev-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
531s Preparing to unpack .../libc6_2.41-1ubuntu2_arm64.deb ...
531s Unpacking libc6:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
531s Setting up libc6:arm64 (2.41-1ubuntu2) ...
531s (Reading database ... 81647 files and directories currently installed.)
531s Preparing to unpack .../libc-bin_2.41-1ubuntu2_arm64.deb ...
531s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
531s Setting up libc-bin (2.41-1ubuntu2) ...
531s (Reading database ... 81647 files and directories currently installed.)
531s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
531s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
532s Setting up locales (2.41-1ubuntu2) ...
532s Generating locales (this might take a while)...
534s   en_US.UTF-8... done
534s Generation complete.
534s Setting up libc-dev-bin (2.41-1ubuntu2) ...
534s Setting up libc6-dev:arm64 (2.41-1ubuntu2) ...
534s Processing triggers for man-db (2.13.0-1) ...
535s Processing triggers for systemd (257.3-1ubuntu3) ...
536s Reading package lists...
537s Building dependency tree...
537s Reading state information...
537s Starting pkgProblemResolver with broken count: 0
537s Starting 2 pkgProblemResolver with broken count: 0
537s Done
537s Solving dependencies...
538s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
538s autopkgtest [12:30:23]: rebooting testbed after setup commands that affected boot
564s Reading package lists...
565s Building dependency tree...
565s Reading state information...
565s Starting pkgProblemResolver with broken count: 0
565s Starting 2 pkgProblemResolver with broken count: 0
565s Done
566s The following NEW packages will be installed:
566s   liblzf1 redis-sentinel redis-server redis-tools
566s 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded.
566s Need to get 1236 kB of archives.
566s After this operation, 7268 kB of additional disk space will be used.
566s Get:1 http://ftpmaster.internal/ubuntu plucky/universe arm64 liblzf1 arm64 3.6-4 [7426 B]
566s Get:2 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-tools arm64 5:7.0.15-3 [1165 kB]
568s Get:3 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-sentinel arm64 5:7.0.15-3 [12.2 kB]
568s Get:4 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-server arm64 5:7.0.15-3 [51.7 kB]
568s Fetched 1236 kB in 2s (681 kB/s)
568s Selecting previously unselected package liblzf1:arm64.
569s (Reading database ... 81647 files and directories currently installed.)
569s Preparing to unpack .../liblzf1_3.6-4_arm64.deb ...
569s Unpacking liblzf1:arm64 (3.6-4) ...
569s Selecting previously unselected package redis-tools.
569s Preparing to unpack .../redis-tools_5%3a7.0.15-3_arm64.deb ...
569s Unpacking redis-tools (5:7.0.15-3) ...
569s Selecting previously unselected package redis-sentinel.
569s Preparing to unpack .../redis-sentinel_5%3a7.0.15-3_arm64.deb ...
569s Unpacking redis-sentinel (5:7.0.15-3) ...
569s Selecting previously unselected package redis-server.
569s Preparing to unpack .../redis-server_5%3a7.0.15-3_arm64.deb ...
569s Unpacking redis-server (5:7.0.15-3) ...
569s Setting up liblzf1:arm64 (3.6-4) ...
569s Setting up redis-tools (5:7.0.15-3) ...
569s Setting up redis-server (5:7.0.15-3) ...
569s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'.
569s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'.
570s Setting up redis-sentinel (5:7.0.15-3) ...
570s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
570s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
571s Processing triggers for man-db (2.13.0-1) ...
571s Processing triggers for libc-bin (2.41-1ubuntu2) ...
579s autopkgtest [12:31:04]: test 0006-migrate-from-redis: [-----------------------
579s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION
579s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf
579s + systemctl restart redis-server
579s + redis-cli -h 127.0.0.1 -p 6379 SET test 1
579s OK
579s + redis-cli -h 127.0.0.1 -p 6379 GET test
579s 1
579s + redis-cli -h 127.0.0.1 -p 6379 SAVE
579s OK
579s + sha256sum /var/lib/redis/dump.rdb
579s 369987b7cb14d71e73b983e26be2ab647441fffd58b894b126b3f07c4c3f20ca /var/lib/redis/dump.rdb
579s + apt-get install -y valkey-redis-compat
579s Reading package lists...
579s Building dependency tree...
579s Reading state information...
580s Solving dependencies...
580s The following additional packages will be installed:
580s   valkey-server valkey-tools
580s Suggested packages:
580s   ruby-redis
580s The following packages will be REMOVED:
580s   redis-sentinel redis-server redis-tools
580s The following NEW packages will be installed:
580s   valkey-redis-compat valkey-server valkey-tools
580s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded.
580s Need to get 1308 kB of archives.
580s After this operation, 270 kB of additional disk space will be used.
580s Get:1 http://ftpmaster.internal/ubuntu plucky/universe arm64 valkey-tools arm64 8.0.2+dfsg1-1ubuntu1 [1252 kB]
582s Get:2 http://ftpmaster.internal/ubuntu plucky/universe arm64 valkey-server arm64 8.0.2+dfsg1-1ubuntu1 [48.5 kB]
582s Get:3 http://ftpmaster.internal/ubuntu plucky/universe arm64 valkey-redis-compat all 8.0.2+dfsg1-1ubuntu1 [7744 B]
582s Fetched 1308 kB in 2s (764 kB/s)
582s (Reading database ... 81698 files and directories currently installed.)
582s Removing redis-sentinel (5:7.0.15-3) ...
583s Removing redis-server (5:7.0.15-3) ...
583s Removing redis-tools (5:7.0.15-3) ...
583s Selecting previously unselected package valkey-tools.
583s (Reading database ... 81661 files and directories currently installed.)
583s Preparing to unpack .../valkey-tools_8.0.2+dfsg1-1ubuntu1_arm64.deb ...
583s Unpacking valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
583s Selecting previously unselected package valkey-server.
583s Preparing to unpack .../valkey-server_8.0.2+dfsg1-1ubuntu1_arm64.deb ...
583s Unpacking valkey-server (8.0.2+dfsg1-1ubuntu1) ...
584s Selecting previously unselected package valkey-redis-compat.
584s Preparing to unpack .../valkey-redis-compat_8.0.2+dfsg1-1ubuntu1_all.deb ...
584s Unpacking valkey-redis-compat (8.0.2+dfsg1-1ubuntu1) ...
584s Setting up valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
584s Setting up valkey-server (8.0.2+dfsg1-1ubuntu1) ...
584s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
584s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
585s Setting up valkey-redis-compat (8.0.2+dfsg1-1ubuntu1) ...
585s dpkg-query: no packages found matching valkey-sentinel
585s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf.
585s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf.
585s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey.
585s Processing triggers for man-db (2.13.0-1) ...
585s + '[' -f /etc/valkey/REDIS_MIGRATION ']'
585s + sha256sum /var/lib/valkey/dump.rdb
585s f13f81261b78a12cc640d68461e43aeb6747f1cced4d88e2fb63c48b6e725d6e /var/lib/valkey/dump.rdb
585s + systemctl status valkey-server
585s + grep inactive
585s Active: inactive (dead) since Sat 2025-03-15 12:31:10 UTC; 584ms ago
585s + rm /etc/valkey/REDIS_MIGRATION
585s + systemctl start valkey-server
586s + systemctl status valkey-server
586s Active: active (running) since Sat 2025-03-15 12:31:11 UTC; 10ms ago
586s + grep running
586s + sha256sum /var/lib/valkey/dump.rdb
586s + cat /etc/valkey/valkey.conf
586s + grep loglevel
586s + grep debug
586s + valkey-cli -h 127.0.0.1 -p 6379 GET test
586s + grep 1
586s autopkgtest [12:31:11]: test 0006-migrate-from-redis: -----------------------]
586s f13f81261b78a12cc640d68461e43aeb6747f1cced4d88e2fb63c48b6e725d6e /var/lib/valkey/dump.rdb
586s loglevel debug
586s 1
587s autopkgtest [12:31:12]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - -
587s 0006-migrate-from-redis PASS
587s autopkgtest [12:31:12]: @@@@@@@@@@@@@@@@@@@@ summary
587s 0001-valkey-cli PASS
587s 0002-benchmark PASS
587s 0003-valkey-check-aof PASS
587s 0004-valkey-check-rdb PASS
587s 0005-cjson PASS
587s 0006-migrate-from-redis PASS
592s nova [W] Using flock in prodstack6-arm64
592s Creating nova instance adt-plucky-arm64-valkey-20250315-122125-juju-7f2275-prod-proposed-migration-environment-20-48b98757-de80-40e4-95c3-65398558a9c0 from image adt/ubuntu-plucky-arm64-server-20250315.img (UUID bd6e766c-b51f-4b53-86d6-23aa4d18f524)...
592s nova [W] Timed out waiting for e989b6c2-b740-4d9d-96cb-59254177d97e to get deleted.
592s nova [W] Using flock in prodstack6-arm64
592s flock: timeout while waiting to get lock
592s Creating nova instance adt-plucky-arm64-valkey-20250315-122125-juju-7f2275-prod-proposed-migration-environment-20-48b98757-de80-40e4-95c3-65398558a9c0 from image adt/ubuntu-plucky-arm64-server-20250315.img (UUID bd6e766c-b51f-4b53-86d6-23aa4d18f524)...
592s nova [W] Timed out waiting for ad3d4b8c-3eeb-4bbd-bb4f-3ea67df9281b to get deleted.
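The 0006-migrate-from-redis test above verifies the migrated dump by hashing /var/lib/valkey/dump.rdb before and after restarting valkey-server and checking that both hashes match (f13f8126... in both positions of the log). A minimal standalone sketch of that integrity check, using a temporary stand-in file rather than the testbed's real paths:

```shell
# Sketch only: mimics the test's sha256 stability check across a dump move
# and service restart. The file content and paths here are placeholders,
# not the actual RDB data or /var/lib/valkey layout from the log.
set -eu
tmpdir=$(mktemp -d)

# Stand-in for the redis dump written by SAVE.
printf 'REDIS0011-placeholder-payload' > "$tmpdir/dump.rdb"
before=$(sha256sum "$tmpdir/dump.rdb" | cut -d' ' -f1)

# valkey-redis-compat reports moving on-disk dumps to the valkey directory;
# model that as a plain mv, which cannot alter file content.
mkdir "$tmpdir/valkey"
mv "$tmpdir/dump.rdb" "$tmpdir/valkey/dump.rdb"
after=$(sha256sum "$tmpdir/valkey/dump.rdb" | cut -d' ' -f1)

[ "$before" = "$after" ] && echo "dump intact"
```

Note that in the log the pre-migration hash under /var/lib/redis (369987b7...) differs from the post-migration hash under /var/lib/valkey, so the test only asserts hash stability from the migrated file onward, which is what this sketch models.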