0s autopkgtest [21:01:47]: starting date and time: 2025-07-27 21:01:47+0000
0s autopkgtest [21:01:47]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [21:01:47]: host juju-7f2275-prod-proposed-migration-environment-23; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.4jr3l3bt/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:redis --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=redis/5:8.0.2-3 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor builder-cpu2-ram4-disk20 --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-23@bos03-1.secgroup --name adt-questing-amd64-valkey-20250727-195506-juju-7f2275-prod-proposed-migration-environment-23-1ed870bd-c645-41fe-9c20-47649be53bf2 --image adt/ubuntu-questing-amd64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-23 --net-id=net_prod-proposed-migration-amd64 -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
3s Creating nova instance adt-questing-amd64-valkey-20250727-195506-juju-7f2275-prod-proposed-migration-environment-23-1ed870bd-c645-41fe-9c20-47649be53bf2 from image adt/ubuntu-questing-amd64-server-20250727.img (UUID 19a66749-1393-4666-8e85-1bb5b7c6ee26)...
48s autopkgtest [21:02:35]: testbed dpkg architecture: amd64
48s autopkgtest [21:02:35]: testbed apt version: 3.1.3
48s autopkgtest [21:02:35]: @@@@@@@@@@@@@@@@@@@@ test bed setup
48s autopkgtest [21:02:35]: testbed release detected to be: None
49s autopkgtest [21:02:36]: updating testbed package index (apt update)
50s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
50s Hit:2 http://ftpmaster.internal/ubuntu questing InRelease
50s Hit:3 http://ftpmaster.internal/ubuntu questing-updates InRelease
50s Hit:4 http://ftpmaster.internal/ubuntu questing-security InRelease
50s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [30.7 kB]
50s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [149 kB]
50s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [13.0 kB]
50s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main amd64 Packages [52.3 kB]
50s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/main i386 Packages [42.1 kB]
50s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/universe amd64 Packages [138 kB]
50s Get:11 http://ftpmaster.internal/ubuntu questing-proposed/universe i386 Packages [80.3 kB]
50s Get:12 http://ftpmaster.internal/ubuntu questing-proposed/multiverse amd64 Packages [4696 B]
50s Get:13 http://ftpmaster.internal/ubuntu questing-proposed/multiverse i386 Packages [8000 B]
50s Fetched 768 kB in 1s (824 kB/s)
51s Reading package lists...
52s autopkgtest [21:02:39]: upgrading testbed (apt dist-upgrade and autopurge)
52s Reading package lists...
52s Building dependency tree...
52s Reading state information...
53s Calculating upgrade...
53s The following packages will be upgraded:
53s   iputils-ping iputils-tracepath libpam-modules libpam-modules-bin
53s   libpam-runtime libpam0g libxml2-16 rsync usb.ids
53s 9 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
53s Need to get 1793 kB of archives.
53s After this operation, 140 kB of additional disk space will be used.
53s Get:1 http://ftpmaster.internal/ubuntu questing/main amd64 libpam0g amd64 1.7.0-5ubuntu1 [69.5 kB]
53s Get:2 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-modules-bin amd64 1.7.0-5ubuntu1 [45.6 kB]
53s Get:3 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-modules amd64 1.7.0-5ubuntu1 [192 kB]
53s Get:4 http://ftpmaster.internal/ubuntu questing/main amd64 rsync amd64 3.4.1+ds1-5 [445 kB]
53s Get:5 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-runtime all 1.7.0-5ubuntu1 [149 kB]
53s Get:6 http://ftpmaster.internal/ubuntu questing/main amd64 iputils-ping amd64 3:20240905-3ubuntu2 [46.4 kB]
53s Get:7 http://ftpmaster.internal/ubuntu questing/main amd64 libxml2-16 amd64 2.14.5+dfsg-0exp1 [607 kB]
53s Get:8 http://ftpmaster.internal/ubuntu questing/main amd64 iputils-tracepath amd64 3:20240905-3ubuntu2 [14.5 kB]
53s Get:9 http://ftpmaster.internal/ubuntu questing/main amd64 usb.ids all 2025.07.26-1 [224 kB]
54s Preconfiguring packages ...
54s Fetched 1793 kB in 1s (2120 kB/s)
54s (Reading database ... 81855 files and directories currently installed.)
54s Preparing to unpack .../libpam0g_1.7.0-5ubuntu1_amd64.deb ...
54s Unpacking libpam0g:amd64 (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
54s Setting up libpam0g:amd64 (1.7.0-5ubuntu1) ...
54s (Reading database ... 81855 files and directories currently installed.)
54s Preparing to unpack .../libpam-modules-bin_1.7.0-5ubuntu1_amd64.deb ...
54s Unpacking libpam-modules-bin (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
55s Setting up libpam-modules-bin (1.7.0-5ubuntu1) ...
55s pam_namespace.service is a disabled or a static unit not running, not starting it.
55s (Reading database ... 81847 files and directories currently installed.)
55s Preparing to unpack .../libpam-modules_1.7.0-5ubuntu1_amd64.deb ...
55s Unpacking libpam-modules:amd64 (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
55s Setting up libpam-modules:amd64 (1.7.0-5ubuntu1) ...
55s Installing new version of config file /etc/security/access.conf ...
55s Installing new version of config file /etc/security/pwhistory.conf ...
55s (Reading database ... 81794 files and directories currently installed.)
55s Preparing to unpack .../rsync_3.4.1+ds1-5_amd64.deb ...
55s Unpacking rsync (3.4.1+ds1-5) over (3.4.1+ds1-4) ...
55s Preparing to unpack .../libpam-runtime_1.7.0-5ubuntu1_all.deb ...
55s Unpacking libpam-runtime (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
55s Setting up libpam-runtime (1.7.0-5ubuntu1) ...
56s (Reading database ... 81857 files and directories currently installed.)
56s Preparing to unpack .../iputils-ping_3%3a20240905-3ubuntu2_amd64.deb ...
56s Unpacking iputils-ping (3:20240905-3ubuntu2) over (3:20240905-3ubuntu1) ...
56s Preparing to unpack .../libxml2-16_2.14.5+dfsg-0exp1_amd64.deb ...
56s Unpacking libxml2-16:amd64 (2.14.5+dfsg-0exp1) over (2.14.4+dfsg-0exp1) ...
56s Preparing to unpack .../iputils-tracepath_3%3a20240905-3ubuntu2_amd64.deb ...
56s Unpacking iputils-tracepath (3:20240905-3ubuntu2) over (3:20240905-3ubuntu1) ...
56s Preparing to unpack .../usb.ids_2025.07.26-1_all.deb ...
56s Unpacking usb.ids (2025.07.26-1) over (2025.04.01-1) ...
56s Setting up libxml2-16:amd64 (2.14.5+dfsg-0exp1) ...
56s Setting up usb.ids (2025.07.26-1) ...
56s Setting up iputils-ping (3:20240905-3ubuntu2) ...
56s Setting up iputils-tracepath (3:20240905-3ubuntu2) ...
56s Setting up rsync (3.4.1+ds1-5) ...
57s rsync.service is a disabled or a static unit not running, not starting it.
57s Processing triggers for man-db (2.13.1-1) ...
58s Processing triggers for libc-bin (2.41-6ubuntu2) ...
58s Reading package lists...
59s Building dependency tree...
59s Reading state information...
59s Solving dependencies...
59s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
59s autopkgtest [21:02:46]: rebooting testbed after setup commands that affected boot
80s autopkgtest [21:03:07]: testbed running kernel: Linux 6.15.0-4-generic #4-Ubuntu SMP PREEMPT_DYNAMIC Fri Jul 4 14:41:53 UTC 2025
82s autopkgtest [21:03:09]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
86s Get:1 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.3+dfsg1-0ubuntu1 (dsc) [2484 B]
86s Get:2 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.3+dfsg1-0ubuntu1 (tar) [2729 kB]
86s Get:3 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.3+dfsg1-0ubuntu1 (diff) [20.7 kB]
87s gpgv: Signature made Wed Jul 9 19:54:08 2025 UTC
87s gpgv: using RSA key 63EEFC3DE14D5146CE7F24BF34B8AD7D9529E793
87s gpgv: issuer "lena.voytek@canonical.com"
87s gpgv: Can't check signature: No public key
87s dpkg-source: warning: cannot verify inline signature for ./valkey_8.1.3+dfsg1-0ubuntu1.dsc: no acceptable signature found
87s autopkgtest [21:03:14]: testing package valkey version 8.1.3+dfsg1-0ubuntu1
88s autopkgtest [21:03:15]: build not needed
90s autopkgtest [21:03:17]: test 0001-valkey-cli: preparing testbed
90s Reading package lists...
91s Building dependency tree...
91s Reading state information...
91s Solving dependencies...
91s The following NEW packages will be installed:
91s   liblzf1 valkey-server valkey-tools
91s 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded.
91s Need to get 1359 kB of archives.
91s After this operation, 7322 kB of additional disk space will be used.
91s Get:1 http://ftpmaster.internal/ubuntu questing/universe amd64 liblzf1 amd64 3.6-4 [7624 B]
91s Get:2 http://ftpmaster.internal/ubuntu questing/universe amd64 valkey-tools amd64 8.1.3+dfsg1-0ubuntu1 [1300 kB]
92s Get:3 http://ftpmaster.internal/ubuntu questing/universe amd64 valkey-server amd64 8.1.3+dfsg1-0ubuntu1 [51.7 kB]
92s Fetched 1359 kB in 1s (1899 kB/s)
92s Selecting previously unselected package liblzf1:amd64.
92s (Reading database ... 81857 files and directories currently installed.)
92s Preparing to unpack .../liblzf1_3.6-4_amd64.deb ...
92s Unpacking liblzf1:amd64 (3.6-4) ...
92s Selecting previously unselected package valkey-tools.
92s Preparing to unpack .../valkey-tools_8.1.3+dfsg1-0ubuntu1_amd64.deb ...
92s Unpacking valkey-tools (8.1.3+dfsg1-0ubuntu1) ...
92s Selecting previously unselected package valkey-server.
92s Preparing to unpack .../valkey-server_8.1.3+dfsg1-0ubuntu1_amd64.deb ...
92s Unpacking valkey-server (8.1.3+dfsg1-0ubuntu1) ...
92s Setting up liblzf1:amd64 (3.6-4) ...
92s Setting up valkey-tools (8.1.3+dfsg1-0ubuntu1) ...
93s Setting up valkey-server (8.1.3+dfsg1-0ubuntu1) ...
93s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
93s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
93s Processing triggers for man-db (2.13.1-1) ...
94s Processing triggers for libc-bin (2.41-6ubuntu2) ...
95s autopkgtest [21:03:22]: test 0001-valkey-cli: [-----------------------
101s # Server
101s redis_version:7.2.4
101s server_name:valkey
101s valkey_version:8.1.3
101s valkey_release_stage:ga
101s redis_git_sha1:00000000
101s redis_git_dirty:0
101s redis_build_id:41709310f77be544
101s server_mode:standalone
101s os:Linux 6.15.0-4-generic x86_64
101s arch_bits:64
101s monotonic_clock:POSIX clock_gettime
101s multiplexing_api:epoll
101s gcc_version:14.3.0
101s process_id:1706
101s process_supervised:systemd
101s run_id:1f2437c210b0583a5018dacfb491da62bfed1079
101s tcp_port:6379
101s server_time_usec:1753650207991877
101s uptime_in_seconds:5
101s uptime_in_days:0
101s hz:10
101s configured_hz:10
101s clients_hz:10
101s lru_clock:8819743
101s executable:/usr/bin/valkey-server
101s config_file:/etc/valkey/valkey.conf
101s io_threads_active:0
101s availability_zone:
101s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
101s
101s # Clients
101s connected_clients:1
101s cluster_connections:0
101s maxclients:10000
101s client_recent_max_input_buffer:0
101s client_recent_max_output_buffer:0
101s blocked_clients:0
101s tracking_clients:0
101s pubsub_clients:0
101s watching_clients:0
101s clients_in_timeout_table:0
101s total_watched_keys:0
101s total_blocking_keys:0
101s total_blocking_keys_on_nokey:0
101s paused_reason:none
101s paused_actions:none
101s paused_timeout_milliseconds:0
101s
101s # Memory
101s used_memory:911904
101s used_memory_human:890.53K
101s used_memory_rss:14540800
101s used_memory_rss_human:13.87M
101s used_memory_peak:911904
101s used_memory_peak_human:890.53K
101s used_memory_peak_perc:100.31%
101s used_memory_overhead:892000
101s used_memory_startup:891776
101s used_memory_dataset:19904
101s used_memory_dataset_perc:98.89%
101s allocator_allocated:2074752
101s allocator_active:2244608
101s allocator_resident:5009408
101s allocator_muzzy:0
101s total_system_memory:4106387456
101s total_system_memory_human:3.82G
101s used_memory_lua:32768
101s used_memory_vm_eval:32768
101s used_memory_lua_human:32.00K
101s used_memory_scripts_eval:0
101s number_of_cached_scripts:0
101s number_of_functions:0
101s number_of_libraries:0
101s used_memory_vm_functions:33792
101s used_memory_vm_total:66560
101s used_memory_vm_total_human:65.00K
101s used_memory_functions:224
101s used_memory_scripts:224
101s used_memory_scripts_human:224B
101s maxmemory:0
101s maxmemory_human:0B
101s maxmemory_policy:noeviction
101s allocator_frag_ratio:1.00
101s allocator_frag_bytes:0
101s allocator_rss_ratio:2.23
101s allocator_rss_bytes:2764800
101s rss_overhead_ratio:2.90
101s rss_overhead_bytes:9531392
101s mem_fragmentation_ratio:16.30
101s mem_fragmentation_bytes:13648880
101s mem_not_counted_for_evict:0
101s mem_replication_backlog:0
101s mem_total_replication_buffers:0
101s mem_clients_slaves:0
101s mem_clients_normal:0
101s mem_cluster_links:0
101s mem_aof_buffer:0
101s mem_allocator:jemalloc-5.3.0
101s mem_overhead_db_hashtable_rehashing:0
101s active_defrag_running:0
101s lazyfree_pending_objects:0
101s lazyfreed_objects:0
101s
101s # Persistence
101s loading:0
101s async_loading:0
101s current_cow_peak:0
101s current_cow_size:0
101s current_cow_size_age:0
101s current_fork_perc:0.00
101s current_save_keys_processed:0
101s current_save_keys_total:0
101s rdb_changes_since_last_save:0
101s rdb_bgsave_in_progress:0
101s rdb_last_save_time:1753650202
101s rdb_last_bgsave_status:ok
101s rdb_last_bgsave_time_sec:-1
101s rdb_current_bgsave_time_sec:-1
101s rdb_saves:0
101s rdb_last_cow_size:0
101s rdb_last_load_keys_expired:0
101s rdb_last_load_keys_loaded:0
101s aof_enabled:0
101s aof_rewrite_in_progress:0
101s aof_rewrite_scheduled:0
101s aof_last_rewrite_time_sec:-1
101s aof_current_rewrite_time_sec:-1
101s aof_last_bgrewrite_status:ok
101s aof_rewrites:0
101s aof_rewrites_consecutive_failures:0
101s aof_last_write_status:ok
101s aof_last_cow_size:0
101s module_fork_in_progress:0
101s module_fork_last_cow_size:0
101s
101s # Stats
101s total_connections_received:1
101s total_commands_processed:0
101s instantaneous_ops_per_sec:0
101s total_net_input_bytes:14
101s total_net_output_bytes:0
101s total_net_repl_input_bytes:0
101s total_net_repl_output_bytes:0
101s instantaneous_input_kbps:0.00
101s instantaneous_output_kbps:0.00
101s instantaneous_input_repl_kbps:0.00
101s instantaneous_output_repl_kbps:0.00
101s rejected_connections:0
101s sync_full:0
101s sync_partial_ok:0
101s sync_partial_err:0
101s expired_keys:0
101s expired_stale_perc:0.00
101s expired_time_cap_reached_count:0
101s expire_cycle_cpu_milliseconds:0
101s evicted_keys:0
101s evicted_clients:0
101s evicted_scripts:0
101s total_eviction_exceeded_time:0
101s current_eviction_exceeded_time:0
101s keyspace_hits:0
101s keyspace_misses:0
101s pubsub_channels:0
101s pubsub_patterns:0
101s pubsubshard_channels:0
101s latest_fork_usec:0
101s total_forks:0
101s migrate_cached_sockets:0
101s slave_expires_tracked_keys:0
101s active_defrag_hits:0
101s active_defrag_misses:0
101s active_defrag_key_hits:0
101s active_defrag_key_misses:0
101s total_active_defrag_time:0
101s current_active_defrag_time:0
101s tracking_total_keys:0
101s tracking_total_items:0
101s tracking_total_prefixes:0
101s unexpected_error_replies:0
101s total_error_replies:0
101s dump_payload_sanitizations:0
101s total_reads_processed:1
101s total_writes_processed:0
101s io_threaded_reads_processed:0
101s io_threaded_writes_processed:0
101s io_threaded_freed_objects:0
101s io_threaded_accept_processed:0
101s io_threaded_poll_processed:0
101s io_threaded_total_prefetch_batches:0
101s io_threaded_total_prefetch_entries:0
101s client_query_buffer_limit_disconnections:0
101s client_output_buffer_limit_disconnections:0
101s reply_buffer_shrinks:0
101s reply_buffer_expands:0
101s eventloop_cycles:51
101s eventloop_duration_sum:7911
101s eventloop_duration_cmd_sum:0
101s instantaneous_eventloop_cycles_per_sec:9
101s instantaneous_eventloop_duration_usec:185
101s acl_access_denied_auth:0
101s acl_access_denied_cmd:0
101s acl_access_denied_key:0
101s acl_access_denied_channel:0
101s
101s # Replication
101s role:master
101s connected_slaves:0
101s replicas_waiting_psync:0
101s master_failover_state:no-failover
101s master_replid:41bc28e4677ca774bb9e31c6e267446bef50feaa
101s master_replid2:0000000000000000000000000000000000000000
101s master_repl_offset:0
101s second_repl_offset:-1
101s repl_backlog_active:0
101s repl_backlog_size:10485760
101s repl_backlog_first_byte_offset:0
101s repl_backlog_histlen:0
101s
101s # CPU
101s used_cpu_sys:0.045125
101s used_cpu_user:0.034773
101s used_cpu_sys_children:0.001837
101s used_cpu_user_children:0.000000
101s used_cpu_sys_main_thread:0.044532
101s used_cpu_user_main_thread:0.034756
101s
101s # Modules
101s
101s # Errorstats
101s
101s # Cluster
101s cluster_enabled:0
101s
101s # Keyspace
101s Redis ver. 8.1.3
101s autopkgtest [21:03:28]: test 0001-valkey-cli: -----------------------]
101s autopkgtest [21:03:28]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
101s 0001-valkey-cli PASS
102s autopkgtest [21:03:29]: test 0002-benchmark: preparing testbed
102s Reading package lists...
102s Building dependency tree...
102s Reading state information...
102s Solving dependencies...
103s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
104s autopkgtest [21:03:31]: test 0002-benchmark: [-----------------------
109s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=-nan (overall: -nan)
109s ====== PING_INLINE ======
109s   100000 requests completed in 0.18 seconds
109s   50 parallel clients
109s   3 bytes payload
109s   keep alive: 1
109s   host configuration "save": 3600 1 300 100 60 10000
109s   host configuration "appendonly": no
109s   multi-thread: no
109s
109s Latency by percentile distribution:
109s 0.000% <= 0.231 milliseconds (cumulative count 10)
109s 50.000% <= 0.471 milliseconds (cumulative count 52510)
109s 75.000% <= 0.535 milliseconds (cumulative count 76180)
109s 87.500% <= 0.623 milliseconds (cumulative count 87680)
109s 93.750% <= 0.775 milliseconds (cumulative count 93810)
109s 96.875% <= 0.911 milliseconds (cumulative count 96890)
109s 98.438% <= 1.047 milliseconds (cumulative count 98470)
109s 99.219% <= 1.199 milliseconds (cumulative count 99220)
109s 99.609% <= 1.495 milliseconds (cumulative count 99610)
109s 99.805% <= 2.903 milliseconds (cumulative count 99810)
109s 99.902% <= 2.935 milliseconds (cumulative count 99920)
109s 99.951% <= 2.951 milliseconds (cumulative count 99970)
109s 99.976% <= 2.959 milliseconds (cumulative count 99980)
109s 99.988% <= 2.967 milliseconds (cumulative count 99990)
109s 99.994% <= 2.975 milliseconds (cumulative count 100000)
109s 100.000% <= 2.975 milliseconds (cumulative count 100000)
109s
109s Cumulative distribution of latencies:
109s 0.000% <= 0.103 milliseconds (cumulative count 0)
109s 0.090% <= 0.303 milliseconds (cumulative count 90)
109s 6.630% <= 0.407 milliseconds (cumulative count 6630)
109s 65.420% <= 0.503 milliseconds (cumulative count 65420)
109s 86.670% <= 0.607 milliseconds (cumulative count 86670)
109s 91.170% <= 0.703 milliseconds (cumulative count 91170)
109s 94.800% <= 0.807 milliseconds (cumulative count 94800)
109s 96.740% <= 0.903 milliseconds (cumulative count 96740)
109s 97.940% <= 1.007 milliseconds (cumulative count 97940)
109s 98.840% <= 1.103 milliseconds (cumulative count 98840)
109s 99.240% <= 1.207 milliseconds (cumulative count 99240)
109s 99.440% <= 1.303 milliseconds (cumulative count 99440)
109s 99.550% <= 1.407 milliseconds (cumulative count 99550)
109s 99.620% <= 1.503 milliseconds (cumulative count 99620)
109s 99.650% <= 1.607 milliseconds (cumulative count 99650)
109s 99.680% <= 1.703 milliseconds (cumulative count 99680)
109s 99.710% <= 1.807 milliseconds (cumulative count 99710)
109s 99.730% <= 1.903 milliseconds (cumulative count 99730)
109s 100.000% <= 3.103 milliseconds (cumulative count 100000)
109s
109s Summary:
109s   throughput summary: 552486.19 requests per second
109s   latency summary (msec):
109s           avg       min       p50       p95       p99       max
109s         0.518     0.224     0.471     0.815     1.135     2.975
109s PING_MBULK: rps=128047.8 (overall: 465797.1) avg_msec=0.594 (overall: 0.594)
109s ====== PING_MBULK ======
109s   100000 requests completed in 0.21 seconds
109s   50 parallel clients
109s   3 bytes payload
109s   keep alive: 1
109s   host configuration "save": 3600 1 300 100 60 10000
109s   host configuration "appendonly": no
109s   multi-thread: no
109s
109s Latency by percentile distribution:
109s 0.000% <= 0.175 milliseconds (cumulative count 10)
109s 50.000% <= 0.527 milliseconds (cumulative count 51760)
109s 75.000% <= 0.583 milliseconds (cumulative count 76360)
109s 87.500% <= 0.655 milliseconds (cumulative count 87850)
109s 93.750% <= 0.743 milliseconds (cumulative count 94080)
109s 96.875% <= 0.863 milliseconds (cumulative count 96990)
109s 98.438% <= 1.079 milliseconds (cumulative count 98470)
109s 99.219% <= 2.079 milliseconds (cumulative count 99220)
109s 99.609% <= 3.751 milliseconds (cumulative count 99640)
109s 99.805% <= 4.199 milliseconds (cumulative count 99810)
109s 99.902% <= 4.255 milliseconds (cumulative count 99920)
109s 99.951% <= 4.279 milliseconds (cumulative count 99960)
109s 99.976% <= 4.287 milliseconds (cumulative count 99990)
109s 99.994% <= 4.295 milliseconds (cumulative count 100000)
109s 100.000% <= 4.295 milliseconds (cumulative count 100000)
109s
109s Cumulative distribution of latencies:
109s 0.000% <= 0.103 milliseconds (cumulative count 0)
109s 0.020% <= 0.207 milliseconds (cumulative count 20)
109s 0.100% <= 0.303 milliseconds (cumulative count 100)
109s 0.480% <= 0.407 milliseconds (cumulative count 480)
109s 30.720% <= 0.503 milliseconds (cumulative count 30720)
109s 81.400% <= 0.607 milliseconds (cumulative count 81400)
109s 92.070% <= 0.703 milliseconds (cumulative count 92070)
109s 95.990% <= 0.807 milliseconds (cumulative count 95990)
109s 97.440% <= 0.903 milliseconds (cumulative count 97440)
109s 98.030% <= 1.007 milliseconds (cumulative count 98030)
109s 98.540% <= 1.103 milliseconds (cumulative count 98540)
109s 98.690% <= 1.207 milliseconds (cumulative count 98690)
109s 98.730% <= 1.303 milliseconds (cumulative count 98730)
109s 98.740% <= 1.407 milliseconds (cumulative count 98740)
109s 98.950% <= 1.703 milliseconds (cumulative count 98950)
109s 99.120% <= 1.807 milliseconds (cumulative count 99120)
109s 99.160% <= 1.903 milliseconds (cumulative count 99160)
109s 99.200% <= 2.007 milliseconds (cumulative count 99200)
109s 99.220% <= 2.103 milliseconds (cumulative count 99220)
109s 99.500% <= 3.103 milliseconds (cumulative count 99500)
109s 99.750% <= 4.103 milliseconds (cumulative count 99750)
109s 100.000% <= 5.103 milliseconds (cumulative count 100000)
109s
109s Summary:
109s   throughput summary: 469483.56 requests per second
109s   latency summary (msec):
109s           avg       min       p50       p95       p99       max
109s         0.582     0.168     0.527     0.775     1.743     4.295
110s SET: rps=175418.3 (overall: 423365.4) avg_msec=0.970 (overall: 0.970)
110s ====== SET ======
110s   100000 requests completed in 0.26 seconds
110s   50 parallel clients
110s   3 bytes payload
110s   keep alive: 1
110s   host configuration "save": 3600 1 300 100 60 10000
110s   host configuration "appendonly": no
110s   multi-thread: no
110s
110s Latency by percentile distribution:
110s 0.000% <= 0.175 milliseconds (cumulative count 10)
110s 50.000% <= 0.775 milliseconds (cumulative count 50500)
110s 75.000% <= 0.935 milliseconds (cumulative count 75800)
110s 87.500% <= 1.159 milliseconds (cumulative count 87590)
110s 93.750% <= 1.383 milliseconds (cumulative count 93890)
110s 96.875% <= 1.567 milliseconds (cumulative count 96930)
110s 98.438% <= 1.895 milliseconds (cumulative count 98460)
110s 99.219% <= 2.063 milliseconds (cumulative count 99260)
110s 99.609% <= 2.343 milliseconds (cumulative count 99610)
110s 99.805% <= 2.623 milliseconds (cumulative count 99860)
110s 99.902% <= 2.647 milliseconds (cumulative count 99950)
110s 99.951% <= 2.655 milliseconds (cumulative count 99970)
110s 99.976% <= 2.663 milliseconds (cumulative count 99980)
110s 99.988% <= 2.687 milliseconds (cumulative count 99990)
110s 99.994% <= 2.759 milliseconds (cumulative count 100000)
110s 100.000% <= 2.759 milliseconds (cumulative count 100000)
110s
110s Cumulative distribution of latencies:
110s 0.000% <= 0.103 milliseconds (cumulative count 0)
110s 0.030% <= 0.207 milliseconds (cumulative count 30)
110s 0.090% <= 0.303 milliseconds (cumulative count 90)
110s 0.180% <= 0.407 milliseconds (cumulative count 180)
110s 2.230% <= 0.503 milliseconds (cumulative count 2230)
110s 10.050% <= 0.607 milliseconds (cumulative count 10050)
110s 30.660% <= 0.703 milliseconds (cumulative count 30660)
110s 58.910% <= 0.807 milliseconds (cumulative count 58910)
110s 72.620% <= 0.903 milliseconds (cumulative count 72620)
110s 80.630% <= 1.007 milliseconds (cumulative count 80630)
110s 85.590% <= 1.103 milliseconds (cumulative count 85590)
110s 89.210% <= 1.207 milliseconds (cumulative count 89210)
110s 91.950% <= 1.303 milliseconds (cumulative count 91950)
110s 94.620% <= 1.407 milliseconds (cumulative count 94620)
110s 96.450% <= 1.503 milliseconds (cumulative count 96450)
110s 97.240% <= 1.607 milliseconds (cumulative count 97240)
110s 97.580% <= 1.703 milliseconds (cumulative count 97580)
110s 98.050% <= 1.807 milliseconds (cumulative count 98050)
110s 98.520% <= 1.903 milliseconds (cumulative count 98520)
110s 99.050% <= 2.007 milliseconds (cumulative count 99050)
110s 99.390% <= 2.103 milliseconds (cumulative count 99390)
110s 100.000% <= 3.103 milliseconds (cumulative count 100000)
110s
110s Summary:
110s   throughput summary: 383141.75 requests per second
110s   latency summary (msec):
110s           avg       min       p50       p95       p99       max
110s         0.855     0.168     0.775     1.423     2.007     2.759
110s GET: rps=167320.0 (overall: 470000.0) avg_msec=0.601 (overall: 0.601)
110s ====== GET ======
110s   100000 requests completed in 0.21 seconds
110s   50 parallel clients
110s   3 bytes payload
110s   keep alive: 1
110s   host configuration "save": 3600 1 300 100 60 10000
110s   host configuration "appendonly": no
110s   multi-thread: no
110s
110s Latency by percentile distribution:
110s 0.000% <= 0.239 milliseconds (cumulative count 10)
110s 50.000% <= 0.551 milliseconds (cumulative count 52370)
110s 75.000% <= 0.647 milliseconds (cumulative count 75870)
110s 87.500% <= 0.783 milliseconds (cumulative count 87860)
110s 93.750% <= 0.935 milliseconds (cumulative count 93800)
110s 96.875% <= 1.079 milliseconds (cumulative count 96970)
110s 98.438% <= 1.223 milliseconds (cumulative count 98440)
110s 99.219% <= 1.407 milliseconds (cumulative count 99230)
110s 99.609% <= 1.631 milliseconds (cumulative count 99610)
110s 99.805% <= 2.607 milliseconds (cumulative count 99810)
110s 99.902% <= 2.623 milliseconds (cumulative count 99920)
110s 99.951% <= 2.639 milliseconds (cumulative count 99970)
110s 99.976% <= 2.647 milliseconds (cumulative count 99990)
110s 99.994% <= 2.663 milliseconds (cumulative count 100000)
110s 100.000% <= 2.663 milliseconds (cumulative count 100000)
110s
110s Cumulative distribution of latencies:
110s 0.000% <= 0.103 milliseconds (cumulative count 0)
110s 0.030% <= 0.303 milliseconds (cumulative count 30)
110s 0.380% <= 0.407 milliseconds (cumulative count 380)
110s 25.960% <= 0.503 milliseconds (cumulative count 25960)
110s 69.000% <= 0.607 milliseconds (cumulative count 69000)
110s 82.210% <= 0.703 milliseconds (cumulative count 82210)
110s 89.060% <= 0.807 milliseconds (cumulative count 89060)
110s 92.990% <= 0.903 milliseconds (cumulative count 92990)
110s 95.620% <= 1.007 milliseconds (cumulative count 95620)
110s 97.380% <= 1.103 milliseconds (cumulative count 97380)
110s 98.340% <= 1.207 milliseconds (cumulative count 98340)
110s 98.860% <= 1.303 milliseconds (cumulative count 98860)
110s 99.230% <= 1.407 milliseconds (cumulative count 99230)
110s 99.440% <= 1.503 milliseconds (cumulative count 99440)
110s 99.590% <= 1.607 milliseconds (cumulative count 99590)
110s 99.660% <= 1.703 milliseconds (cumulative count 99660)
110s 99.700% <= 1.807 milliseconds (cumulative count 99700)
110s 99.730% <= 1.903 milliseconds (cumulative count 99730)
110s 100.000% <= 3.103 milliseconds (cumulative count 100000)
110s
110s Summary:
110s   throughput summary: 485436.91 requests per second
110s   latency summary (msec):
110s           avg       min       p50       p95       p99       max
110s         0.611     0.232     0.551     0.983     1.335     2.663
110s INCR: rps=234143.4 (overall: 448626.0) avg_msec=0.726 (overall: 0.726)
110s ====== INCR ======
110s   100000 requests completed in 0.22 seconds
110s   50 parallel clients
110s   3 bytes payload
110s   keep alive: 1
110s   host configuration "save": 3600 1 300 100 60 10000
110s   host configuration "appendonly": no
110s   multi-thread: no
110s
110s Latency by percentile distribution:
110s 0.000% <= 0.191 milliseconds (cumulative count 10)
110s 50.000% <= 0.599 milliseconds (cumulative count 50520)
110s 75.000% <= 0.759 milliseconds (cumulative count 75740)
110s 87.500% <= 0.887 milliseconds (cumulative count 87700)
110s 93.750% <= 1.015 milliseconds (cumulative count 93840)
110s 96.875% <= 1.151 milliseconds (cumulative count 97000)
110s 98.438% <= 1.279 milliseconds (cumulative count 98470)
110s 99.219% <= 1.519 milliseconds (cumulative count 99220)
110s 99.609% <= 1.935 milliseconds (cumulative count 99610)
110s 99.805% <= 2.207 milliseconds (cumulative count 99810)
110s 99.902% <= 2.343 milliseconds (cumulative count 99910)
110s 99.951% <= 2.383 milliseconds (cumulative count 99960)
110s 99.976% <= 2.455 milliseconds (cumulative count 99980)
110s 99.988% <= 2.551 milliseconds (cumulative count 99990)
110s 99.994% <= 2.623 milliseconds (cumulative count 100000)
110s 100.000% <= 2.623 milliseconds (cumulative count 100000)
110s
110s Cumulative distribution of latencies:
110s 0.000% <= 0.103 milliseconds (cumulative count 0)
110s 0.020% <= 0.207 milliseconds (cumulative count 20)
110s 0.060% <= 0.303 milliseconds (cumulative count 60)
110s 0.380% <= 0.407 milliseconds (cumulative count 380)
110s 17.370% <= 0.503 milliseconds (cumulative count 17370)
110s 52.300% <= 0.607 milliseconds (cumulative count 52300)
110s 68.270% <= 0.703 milliseconds (cumulative count 68270)
110s 81.160% <= 0.807 milliseconds (cumulative count 81160)
110s 88.730% <= 0.903 milliseconds (cumulative count 88730)
110s 93.610% <= 1.007 milliseconds (cumulative count 93610)
110s 96.160% <= 1.103 milliseconds (cumulative count 96160)
110s 97.770% <= 1.207 milliseconds (cumulative count 97770)
110s 98.590% <= 1.303 milliseconds (cumulative count 98590)
110s 98.950% <= 1.407 milliseconds (cumulative count 98950)
110s 99.210% <= 1.503 milliseconds (cumulative count 99210)
110s 99.330% <= 1.607 milliseconds (cumulative count 99330)
110s 99.450% <= 1.703 milliseconds (cumulative count 99450)
110s 99.560% <= 1.807 milliseconds (cumulative count 99560)
110s 99.580% <= 1.903 milliseconds (cumulative count 99580)
110s 99.660% <= 2.007 milliseconds (cumulative count 99660)
110s 99.740% <= 2.103 milliseconds (cumulative count 99740)
110s 100.000% <= 3.103 milliseconds (cumulative count 100000)
110s
110s Summary:
110s   throughput summary: 458715.59 requests per second
110s   latency summary (msec):
110s           avg       min       p50       p95       p99       max
110s         0.666     0.184     0.599     1.063     1.423     2.623
110s LPUSH: rps=258214.3 (overall: 399202.4) avg_msec=0.708 (overall: 0.708)
110s ======
LPUSH ====== 110s 100000 requests completed in 0.25 seconds 110s 50 parallel clients 110s 3 bytes payload 110s keep alive: 1 110s host configuration "save": 3600 1 300 100 60 10000 110s host configuration "appendonly": no 110s multi-thread: no 110s 110s Latency by percentile distribution: 110s 0.000% <= 0.199 milliseconds (cumulative count 10) 110s 50.000% <= 0.703 milliseconds (cumulative count 50280) 110s 75.000% <= 0.831 milliseconds (cumulative count 75300) 110s 87.500% <= 0.919 milliseconds (cumulative count 88150) 110s 93.750% <= 1.007 milliseconds (cumulative count 93870) 110s 96.875% <= 1.119 milliseconds (cumulative count 96970) 110s 98.438% <= 1.215 milliseconds (cumulative count 98470) 110s 99.219% <= 1.351 milliseconds (cumulative count 99220) 110s 99.609% <= 1.447 milliseconds (cumulative count 99670) 110s 99.805% <= 1.487 milliseconds (cumulative count 99820) 110s 99.902% <= 1.543 milliseconds (cumulative count 99910) 110s 99.951% <= 1.599 milliseconds (cumulative count 99960) 110s 99.976% <= 1.639 milliseconds (cumulative count 99980) 110s 99.988% <= 1.671 milliseconds (cumulative count 99990) 110s 99.994% <= 1.727 milliseconds (cumulative count 100000) 110s 100.000% <= 1.727 milliseconds (cumulative count 100000) 110s 110s Cumulative distribution of latencies: 110s 0.000% <= 0.103 milliseconds (cumulative count 0) 110s 0.020% <= 0.207 milliseconds (cumulative count 20) 110s 0.090% <= 0.303 milliseconds (cumulative count 90) 110s 0.450% <= 0.407 milliseconds (cumulative count 450) 110s 8.320% <= 0.503 milliseconds (cumulative count 8320) 110s 30.750% <= 0.607 milliseconds (cumulative count 30750) 110s 50.280% <= 0.703 milliseconds (cumulative count 50280) 110s 70.420% <= 0.807 milliseconds (cumulative count 70420) 110s 86.550% <= 0.903 milliseconds (cumulative count 86550) 110s 93.870% <= 1.007 milliseconds (cumulative count 93870) 110s 96.590% <= 1.103 milliseconds (cumulative count 96590) 110s 98.390% <= 1.207 milliseconds (cumulative count 98390) 
110s 99.030% <= 1.303 milliseconds (cumulative count 99030)
110s 99.400% <= 1.407 milliseconds (cumulative count 99400)
110s 99.850% <= 1.503 milliseconds (cumulative count 99850)
110s 99.960% <= 1.607 milliseconds (cumulative count 99960)
110s 99.990% <= 1.703 milliseconds (cumulative count 99990)
110s 100.000% <= 1.807 milliseconds (cumulative count 100000)
110s 
110s Summary:
110s throughput summary: 408163.25 requests per second
110s latency summary (msec):
110s avg min p50 p95 p99 max
110s 0.722 0.192 0.703 1.047 1.303 1.727
111s RPUSH: rps=297370.5 (overall: 449638.6) avg_msec=0.639 (overall: 0.639) ====== RPUSH ======
111s 100000 requests completed in 0.23 seconds
111s 50 parallel clients
111s 3 bytes payload
111s keep alive: 1
111s host configuration "save": 3600 1 300 100 60 10000
111s host configuration "appendonly": no
111s multi-thread: no
111s 
111s Latency by percentile distribution:
111s 0.000% <= 0.247 milliseconds (cumulative count 10)
111s 50.000% <= 0.639 milliseconds (cumulative count 50980)
111s 75.000% <= 0.831 milliseconds (cumulative count 75280)
111s 87.500% <= 1.215 milliseconds (cumulative count 87560)
111s 93.750% <= 1.431 milliseconds (cumulative count 93840)
111s 96.875% <= 1.719 milliseconds (cumulative count 96900)
111s 98.438% <= 1.959 milliseconds (cumulative count 98460)
111s 99.219% <= 2.271 milliseconds (cumulative count 99220)
111s 99.609% <= 2.823 milliseconds (cumulative count 99640)
111s 99.805% <= 3.143 milliseconds (cumulative count 99820)
111s 99.902% <= 3.647 milliseconds (cumulative count 99910)
111s 99.951% <= 3.791 milliseconds (cumulative count 99960)
111s 99.976% <= 3.871 milliseconds (cumulative count 99980)
111s 99.988% <= 3.927 milliseconds (cumulative count 99990)
111s 99.994% <= 4.511 milliseconds (cumulative count 100000)
111s 100.000% <= 4.511 milliseconds (cumulative count 100000)
111s 
111s Cumulative distribution of latencies:
111s 0.000% <= 0.103 milliseconds (cumulative count 0)
111s 0.040% <= 0.303 milliseconds (cumulative count 40)
111s 0.270% <= 0.407 milliseconds (cumulative count 270)
111s 13.860% <= 0.503 milliseconds (cumulative count 13860)
111s 44.490% <= 0.607 milliseconds (cumulative count 44490)
111s 61.370% <= 0.703 milliseconds (cumulative count 61370)
111s 73.080% <= 0.807 milliseconds (cumulative count 73080)
111s 79.830% <= 0.903 milliseconds (cumulative count 79830)
111s 83.430% <= 1.007 milliseconds (cumulative count 83430)
111s 85.580% <= 1.103 milliseconds (cumulative count 85580)
111s 87.320% <= 1.207 milliseconds (cumulative count 87320)
111s 89.860% <= 1.303 milliseconds (cumulative count 89860)
111s 93.120% <= 1.407 milliseconds (cumulative count 93120)
111s 95.160% <= 1.503 milliseconds (cumulative count 95160)
111s 96.100% <= 1.607 milliseconds (cumulative count 96100)
111s 96.810% <= 1.703 milliseconds (cumulative count 96810)
111s 97.520% <= 1.807 milliseconds (cumulative count 97520)
111s 98.180% <= 1.903 milliseconds (cumulative count 98180)
111s 98.730% <= 2.007 milliseconds (cumulative count 98730)
111s 99.080% <= 2.103 milliseconds (cumulative count 99080)
111s 99.770% <= 3.103 milliseconds (cumulative count 99770)
111s 99.990% <= 4.103 milliseconds (cumulative count 99990)
111s 100.000% <= 5.103 milliseconds (cumulative count 100000)
111s 
111s Summary:
111s throughput summary: 425531.91 requests per second
111s latency summary (msec):
111s avg min p50 p95 p99 max
111s 0.769 0.240 0.639 1.495 2.079 4.511
111s LPOP: rps=336880.0 (overall: 473146.1) avg_msec=0.890 (overall: 0.890) ====== LPOP ======
111s 100000 requests completed in 0.21 seconds
111s 50 parallel clients
111s 3 bytes payload
111s keep alive: 1
111s host configuration "save": 3600 1 300 100 60 10000
111s host configuration "appendonly": no
111s multi-thread: no
111s 
111s Latency by percentile distribution:
111s 0.000% <= 0.215 milliseconds (cumulative count 10)
111s 50.000% <= 0.863 milliseconds (cumulative count 50470)
111s 75.000% <= 1.015 milliseconds (cumulative count 75790)
111s 87.500% <= 1.143 milliseconds (cumulative count 88000)
111s 93.750% <= 1.231 milliseconds (cumulative count 93950)
111s 96.875% <= 1.335 milliseconds (cumulative count 96930)
111s 98.438% <= 1.447 milliseconds (cumulative count 98450)
111s 99.219% <= 1.567 milliseconds (cumulative count 99220)
111s 99.609% <= 1.751 milliseconds (cumulative count 99610)
111s 99.805% <= 1.975 milliseconds (cumulative count 99810)
111s 99.902% <= 2.223 milliseconds (cumulative count 99910)
111s 99.951% <= 2.423 milliseconds (cumulative count 99960)
111s 99.976% <= 2.455 milliseconds (cumulative count 99980)
111s 99.988% <= 2.471 milliseconds (cumulative count 99990)
111s 99.994% <= 2.503 milliseconds (cumulative count 100000)
111s 100.000% <= 2.503 milliseconds (cumulative count 100000)
111s 
111s Cumulative distribution of latencies:
111s 0.000% <= 0.103 milliseconds (cumulative count 0)
111s 0.050% <= 0.303 milliseconds (cumulative count 50)
111s 0.370% <= 0.407 milliseconds (cumulative count 370)
111s 3.260% <= 0.503 milliseconds (cumulative count 3260)
111s 10.840% <= 0.607 milliseconds (cumulative count 10840)
111s 20.010% <= 0.703 milliseconds (cumulative count 20010)
111s 36.590% <= 0.807 milliseconds (cumulative count 36590)
111s 59.420% <= 0.903 milliseconds (cumulative count 59420)
111s 74.850% <= 1.007 milliseconds (cumulative count 74850)
111s 84.720% <= 1.103 milliseconds (cumulative count 84720)
111s 92.500% <= 1.207 milliseconds (cumulative count 92500)
111s 96.340% <= 1.303 milliseconds (cumulative count 96340)
111s 97.980% <= 1.407 milliseconds (cumulative count 97980)
111s 98.900% <= 1.503 milliseconds (cumulative count 98900)
111s 99.310% <= 1.607 milliseconds (cumulative count 99310)
111s 99.540% <= 1.703 milliseconds (cumulative count 99540)
111s 99.680% <= 1.807 milliseconds (cumulative count 99680)
111s 99.750% <= 1.903 milliseconds (cumulative count 99750)
111s 99.830% <= 2.007 milliseconds (cumulative count 99830)
111s 99.860% <= 2.103 milliseconds (cumulative count 99860)
111s 100.000% <= 3.103 milliseconds (cumulative count 100000)
111s 
111s Summary:
111s throughput summary: 471698.12 requests per second
111s latency summary (msec):
111s avg min p50 p95 p99 max
111s 0.882 0.208 0.863 1.263 1.519 2.503
111s RPOP: rps=368326.7 (overall: 432009.3) avg_msec=0.987 (overall: 0.987) ====== RPOP ======
111s 100000 requests completed in 0.24 seconds
111s 50 parallel clients
111s 3 bytes payload
111s keep alive: 1
111s host configuration "save": 3600 1 300 100 60 10000
111s host configuration "appendonly": no
111s multi-thread: no
111s 
111s Latency by percentile distribution:
111s 0.000% <= 0.367 milliseconds (cumulative count 10)
111s 50.000% <= 0.943 milliseconds (cumulative count 50400)
111s 75.000% <= 1.167 milliseconds (cumulative count 75720)
111s 87.500% <= 1.351 milliseconds (cumulative count 87730)
111s 93.750% <= 1.535 milliseconds (cumulative count 93830)
111s 96.875% <= 1.791 milliseconds (cumulative count 96890)
111s 98.438% <= 1.991 milliseconds (cumulative count 98460)
111s 99.219% <= 2.143 milliseconds (cumulative count 99220)
111s 99.609% <= 2.383 milliseconds (cumulative count 99610)
111s 99.805% <= 2.559 milliseconds (cumulative count 99810)
111s 99.902% <= 2.607 milliseconds (cumulative count 99920)
111s 99.951% <= 2.671 milliseconds (cumulative count 99960)
111s 99.976% <= 2.695 milliseconds (cumulative count 99980)
111s 99.988% <= 2.743 milliseconds (cumulative count 99990)
111s 99.994% <= 2.871 milliseconds (cumulative count 100000)
111s 100.000% <= 2.871 milliseconds (cumulative count 100000)
111s 
111s Cumulative distribution of latencies:
111s 0.000% <= 0.103 milliseconds (cumulative count 0)
111s 0.140% <= 0.407 milliseconds (cumulative count 140)
111s 2.100% <= 0.503 milliseconds (cumulative count 2100)
111s 7.400% <= 0.607 milliseconds (cumulative count 7400)
111s 13.790% <= 0.703 milliseconds (cumulative count 13790)
111s 24.930% <= 0.807 milliseconds (cumulative count 24930)
111s 43.180% <= 0.903 milliseconds (cumulative count 43180)
111s 58.960% <= 1.007 milliseconds (cumulative count 58960)
111s 69.700% <= 1.103 milliseconds (cumulative count 69700)
111s 78.620% <= 1.207 milliseconds (cumulative count 78620)
111s 84.890% <= 1.303 milliseconds (cumulative count 84890)
111s 90.240% <= 1.407 milliseconds (cumulative count 90240)
111s 93.230% <= 1.503 milliseconds (cumulative count 93230)
111s 95.110% <= 1.607 milliseconds (cumulative count 95110)
111s 96.110% <= 1.703 milliseconds (cumulative count 96110)
111s 97.010% <= 1.807 milliseconds (cumulative count 97010)
111s 97.800% <= 1.903 milliseconds (cumulative count 97800)
111s 98.540% <= 2.007 milliseconds (cumulative count 98540)
111s 99.070% <= 2.103 milliseconds (cumulative count 99070)
111s 100.000% <= 3.103 milliseconds (cumulative count 100000)
111s 
111s Summary:
111s throughput summary: 423728.81 requests per second
111s latency summary (msec):
111s avg min p50 p95 p99 max
111s 1.007 0.360 0.943 1.607 2.087 2.871
111s SADD: rps=345480.0 (overall: 383866.7) avg_msec=1.105 (overall: 1.105) ====== SADD ======
111s 100000 requests completed in 0.25 seconds
111s 50 parallel clients
111s 3 bytes payload
111s keep alive: 1
111s host configuration "save": 3600 1 300 100 60 10000
111s host configuration "appendonly": no
111s multi-thread: no
111s 
111s Latency by percentile distribution:
111s 0.000% <= 0.319 milliseconds (cumulative count 10)
111s 50.000% <= 1.055 milliseconds (cumulative count 50470)
111s 75.000% <= 1.303 milliseconds (cumulative count 75210)
111s 87.500% <= 1.495 milliseconds (cumulative count 87620)
111s 93.750% <= 1.679 milliseconds (cumulative count 93830)
111s 96.875% <= 1.815 milliseconds (cumulative count 97000)
111s 98.438% <= 1.951 milliseconds (cumulative count 98460)
111s 99.219% <= 2.319 milliseconds (cumulative count 99240)
111s 99.609% <= 3.351 milliseconds (cumulative count 99610)
111s 99.805% <= 3.991 milliseconds (cumulative count 99810)
111s 99.902% <= 4.415 milliseconds (cumulative count 99910)
111s 99.951% <= 4.639 milliseconds (cumulative count 99960)
111s 99.976% <= 4.735 milliseconds (cumulative count 99980)
111s 99.988% <= 4.775 milliseconds (cumulative count 99990)
111s 99.994% <= 4.839 milliseconds (cumulative count 100000)
111s 100.000% <= 4.839 milliseconds (cumulative count 100000)
111s 
111s Cumulative distribution of latencies:
111s 0.000% <= 0.103 milliseconds (cumulative count 0)
111s 0.240% <= 0.407 milliseconds (cumulative count 240)
111s 3.810% <= 0.503 milliseconds (cumulative count 3810)
111s 13.680% <= 0.607 milliseconds (cumulative count 13680)
111s 22.110% <= 0.703 milliseconds (cumulative count 22110)
111s 30.890% <= 0.807 milliseconds (cumulative count 30890)
111s 39.150% <= 0.903 milliseconds (cumulative count 39150)
111s 46.900% <= 1.007 milliseconds (cumulative count 46900)
111s 54.810% <= 1.103 milliseconds (cumulative count 54810)
111s 65.050% <= 1.207 milliseconds (cumulative count 65050)
111s 75.210% <= 1.303 milliseconds (cumulative count 75210)
111s 83.340% <= 1.407 milliseconds (cumulative count 83340)
111s 87.900% <= 1.503 milliseconds (cumulative count 87900)
111s 91.390% <= 1.607 milliseconds (cumulative count 91390)
111s 94.540% <= 1.703 milliseconds (cumulative count 94540)
111s 96.760% <= 1.807 milliseconds (cumulative count 96760)
111s 98.150% <= 1.903 milliseconds (cumulative count 98150)
111s 98.650% <= 2.007 milliseconds (cumulative count 98650)
111s 98.850% <= 2.103 milliseconds (cumulative count 98850)
111s 99.560% <= 3.103 milliseconds (cumulative count 99560)
111s 99.840% <= 4.103 milliseconds (cumulative count 99840)
111s 100.000% <= 5.103 milliseconds (cumulative count 100000)
111s 
111s Summary:
111s throughput summary: 393700.78 requests per second
111s latency summary (msec):
111s avg min p50 p95 p99 max
111s 1.064 0.312 1.055 1.719 2.183 4.839
112s HSET: rps=346852.6 (overall: 397534.2) avg_msec=0.927 (overall: 0.927) ====== HSET ======
112s 100000 requests completed in 0.26 seconds
112s 50 parallel clients
112s 3 bytes payload
112s keep alive: 1
112s host configuration "save": 3600 1 300 100 60 10000
112s host configuration "appendonly": no
112s multi-thread: no
112s 
112s Latency by percentile distribution:
112s 0.000% <= 0.215 milliseconds (cumulative count 10)
112s 50.000% <= 0.831 milliseconds (cumulative count 50980)
112s 75.000% <= 1.031 milliseconds (cumulative count 75010)
112s 87.500% <= 1.279 milliseconds (cumulative count 87710)
112s 93.750% <= 1.495 milliseconds (cumulative count 93750)
112s 96.875% <= 1.871 milliseconds (cumulative count 96900)
112s 98.438% <= 2.359 milliseconds (cumulative count 98440)
112s 99.219% <= 2.775 milliseconds (cumulative count 99250)
112s 99.609% <= 2.911 milliseconds (cumulative count 99630)
112s 99.805% <= 3.119 milliseconds (cumulative count 99810)
112s 99.902% <= 3.263 milliseconds (cumulative count 99910)
112s 99.951% <= 3.415 milliseconds (cumulative count 99960)
112s 99.976% <= 3.471 milliseconds (cumulative count 99980)
112s 99.988% <= 3.495 milliseconds (cumulative count 99990)
112s 99.994% <= 3.535 milliseconds (cumulative count 100000)
112s 100.000% <= 3.535 milliseconds (cumulative count 100000)
112s 
112s Cumulative distribution of latencies:
112s 0.000% <= 0.103 milliseconds (cumulative count 0)
112s 0.040% <= 0.303 milliseconds (cumulative count 40)
112s 0.140% <= 0.407 milliseconds (cumulative count 140)
112s 0.710% <= 0.503 milliseconds (cumulative count 710)
112s 2.810% <= 0.607 milliseconds (cumulative count 2810)
112s 12.230% <= 0.703 milliseconds (cumulative count 12230)
112s 46.000% <= 0.807 milliseconds (cumulative count 46000)
112s 62.580% <= 0.903 milliseconds (cumulative count 62580)
112s 73.370% <= 1.007 milliseconds (cumulative count 73370)
112s 79.530% <= 1.103 milliseconds (cumulative count 79530)
112s 85.140% <= 1.207 milliseconds (cumulative count 85140)
112s 88.640% <= 1.303 milliseconds (cumulative count 88640)
112s 92.080% <= 1.407 milliseconds (cumulative count 92080)
112s 93.910% <= 1.503 milliseconds (cumulative count 93910)
112s 95.090% <= 1.607 milliseconds (cumulative count 95090)
112s 95.930% <= 1.703 milliseconds (cumulative count 95930)
112s 96.560% <= 1.807 milliseconds (cumulative count 96560)
112s 97.050% <= 1.903 milliseconds (cumulative count 97050)
112s 97.560% <= 2.007 milliseconds (cumulative count 97560)
112s 97.870% <= 2.103 milliseconds (cumulative count 97870)
112s 99.800% <= 3.103 milliseconds (cumulative count 99800)
112s 100.000% <= 4.103 milliseconds (cumulative count 100000)
112s 
112s Summary:
112s throughput summary: 384615.41 requests per second
112s latency summary (msec):
112s avg min p50 p95 p99 max
112s 0.949 0.208 0.831 1.599 2.671 3.535
112s SPOP: rps=384581.7 (overall: 468592.2) avg_msec=0.622 (overall: 0.622) ====== SPOP ======
112s 100000 requests completed in 0.21 seconds
112s 50 parallel clients
112s 3 bytes payload
112s keep alive: 1
112s host configuration "save": 3600 1 300 100 60 10000
112s host configuration "appendonly": no
112s multi-thread: no
112s 
112s Latency by percentile distribution:
112s 0.000% <= 0.175 milliseconds (cumulative count 10)
112s 50.000% <= 0.551 milliseconds (cumulative count 51210)
112s 75.000% <= 0.631 milliseconds (cumulative count 75610)
112s 87.500% <= 0.727 milliseconds (cumulative count 87630)
112s 93.750% <= 0.967 milliseconds (cumulative count 93850)
112s 96.875% <= 1.215 milliseconds (cumulative count 96940)
112s 98.438% <= 1.455 milliseconds (cumulative count 98480)
112s 99.219% <= 1.775 milliseconds (cumulative count 99240)
112s 99.609% <= 3.463 milliseconds (cumulative count 99610)
112s 99.805% <= 4.303 milliseconds (cumulative count 99810)
112s 99.902% <= 4.511 milliseconds (cumulative count 99910)
112s 99.951% <= 4.615 milliseconds (cumulative count 99960)
112s 99.976% <= 4.671 milliseconds (cumulative count 99980)
112s 99.988% <= 4.687 milliseconds (cumulative count 99990)
112s 99.994% <= 4.719 milliseconds (cumulative count 100000)
112s 100.000% <= 4.719 milliseconds (cumulative count 100000)
112s 
112s Cumulative distribution of latencies:
112s 0.000% <= 0.103 milliseconds (cumulative count 0)
112s 0.020% <= 0.207 milliseconds (cumulative count 20)
112s 0.060% <= 0.303 milliseconds (cumulative count 60)
112s 0.500% <= 0.407 milliseconds (cumulative count 500)
112s 27.360% <= 0.503 milliseconds (cumulative count 27360)
112s 69.300% <= 0.607 milliseconds (cumulative count 69300)
112s 86.260% <= 0.703 milliseconds (cumulative count 86260)
112s 90.250% <= 0.807 milliseconds (cumulative count 90250)
112s 92.580% <= 0.903 milliseconds (cumulative count 92580)
112s 94.550% <= 1.007 milliseconds (cumulative count 94550)
112s 95.860% <= 1.103 milliseconds (cumulative count 95860)
112s 96.810% <= 1.207 milliseconds (cumulative count 96810)
112s 97.620% <= 1.303 milliseconds (cumulative count 97620)
112s 98.180% <= 1.407 milliseconds (cumulative count 98180)
112s 98.780% <= 1.503 milliseconds (cumulative count 98780)
112s 99.030% <= 1.607 milliseconds (cumulative count 99030)
112s 99.160% <= 1.703 milliseconds (cumulative count 99160)
112s 99.260% <= 1.807 milliseconds (cumulative count 99260)
112s 99.360% <= 1.903 milliseconds (cumulative count 99360)
112s 99.410% <= 2.007 milliseconds (cumulative count 99410)
112s 99.450% <= 2.103 milliseconds (cumulative count 99450)
112s 99.520% <= 3.103 milliseconds (cumulative count 99520)
112s 99.710% <= 4.103 milliseconds (cumulative count 99710)
112s 100.000% <= 5.103 milliseconds (cumulative count 100000)
112s 
112s Summary:
112s throughput summary: 467289.72 requests per second
112s latency summary (msec):
112s avg min p50 p95 p99 max
112s 0.623 0.168 0.551 1.031 1.599 4.719
112s ====== ZADD ======
112s 100000 requests completed in 0.22 seconds
112s 50 parallel clients
112s 3 bytes payload
112s keep alive: 1
112s host configuration "save": 3600 1 300 100 60 10000
112s host configuration "appendonly": no
112s multi-thread: no
112s 
112s 
Latency by percentile distribution:
112s 0.000% <= 0.359 milliseconds (cumulative count 10)
112s 50.000% <= 0.863 milliseconds (cumulative count 50810)
112s 75.000% <= 1.039 milliseconds (cumulative count 75640)
112s 87.500% <= 1.191 milliseconds (cumulative count 87660)
112s 93.750% <= 1.343 milliseconds (cumulative count 93750)
112s 96.875% <= 1.495 milliseconds (cumulative count 96950)
112s 98.438% <= 1.711 milliseconds (cumulative count 98480)
112s 99.219% <= 1.935 milliseconds (cumulative count 99220)
112s 99.609% <= 2.679 milliseconds (cumulative count 99630)
112s 99.805% <= 2.823 milliseconds (cumulative count 99810)
112s 99.902% <= 2.967 milliseconds (cumulative count 99910)
112s 99.951% <= 3.071 milliseconds (cumulative count 99960)
112s 99.976% <= 3.127 milliseconds (cumulative count 99980)
112s 99.988% <= 3.239 milliseconds (cumulative count 99990)
112s 99.994% <= 3.271 milliseconds (cumulative count 100000)
112s 100.000% <= 3.271 milliseconds (cumulative count 100000)
112s 
112s Cumulative distribution of latencies:
112s 0.000% <= 0.103 milliseconds (cumulative count 0)
112s 0.200% <= 0.407 milliseconds (cumulative count 200)
112s 3.520% <= 0.503 milliseconds (cumulative count 3520)
112s 11.490% <= 0.607 milliseconds (cumulative count 11490)
112s 21.430% <= 0.703 milliseconds (cumulative count 21430)
112s 38.690% <= 0.807 milliseconds (cumulative count 38690)
112s 58.490% <= 0.903 milliseconds (cumulative count 58490)
112s 72.410% <= 1.007 milliseconds (cumulative count 72410)
112s 81.430% <= 1.103 milliseconds (cumulative count 81430)
112s 88.630% <= 1.207 milliseconds (cumulative count 88630)
112s 92.590% <= 1.303 milliseconds (cumulative count 92590)
112s 95.180% <= 1.407 milliseconds (cumulative count 95180)
112s 96.990% <= 1.503 milliseconds (cumulative count 96990)
112s 97.930% <= 1.607 milliseconds (cumulative count 97930)
112s 98.410% <= 1.703 milliseconds (cumulative count 98410)
112s 98.870% <= 1.807 milliseconds (cumulative count 98870)
112s 99.160% <= 1.903 milliseconds (cumulative count 99160)
112s 99.360% <= 2.007 milliseconds (cumulative count 99360)
112s 99.450% <= 2.103 milliseconds (cumulative count 99450)
112s 99.970% <= 3.103 milliseconds (cumulative count 99970)
112s 100.000% <= 4.103 milliseconds (cumulative count 100000)
112s 
112s Summary:
112s throughput summary: 444444.47 requests per second
112s latency summary (msec):
112s avg min p50 p95 p99 max
112s 0.905 0.352 0.863 1.407 1.855 3.271
112s ZPOPMIN: rps=20555.6 (overall: 398461.5) avg_msec=0.852 (overall: 0.852) ====== ZPOPMIN ======
112s 100000 requests completed in 0.21 seconds
112s 50 parallel clients
112s 3 bytes payload
112s keep alive: 1
112s host configuration "save": 3600 1 300 100 60 10000
112s host configuration "appendonly": no
112s multi-thread: no
112s 
112s Latency by percentile distribution:
112s 0.000% <= 0.175 milliseconds (cumulative count 10)
112s 50.000% <= 0.559 milliseconds (cumulative count 50180)
112s 75.000% <= 0.671 milliseconds (cumulative count 75180)
112s 87.500% <= 0.783 milliseconds (cumulative count 88050)
112s 93.750% <= 0.895 milliseconds (cumulative count 93850)
112s 96.875% <= 1.015 milliseconds (cumulative count 96920)
112s 98.438% <= 1.143 milliseconds (cumulative count 98470)
112s 99.219% <= 1.255 milliseconds (cumulative count 99220)
112s 99.609% <= 1.343 milliseconds (cumulative count 99610)
112s 99.805% <= 1.487 milliseconds (cumulative count 99820)
112s 99.902% <= 1.615 milliseconds (cumulative count 99910)
112s 99.951% <= 1.719 milliseconds (cumulative count 99960)
112s 99.976% <= 1.743 milliseconds (cumulative count 99980)
112s 99.988% <= 1.919 milliseconds (cumulative count 99990)
112s 99.994% <= 1.983 milliseconds (cumulative count 100000)
112s 100.000% <= 1.983 milliseconds (cumulative count 100000)
112s 
112s Cumulative distribution of latencies:
112s 0.000% <= 0.103 milliseconds (cumulative count 0)
112s 0.010% <= 0.207 milliseconds (cumulative count 10)
112s 0.090% <= 0.303 milliseconds (cumulative count 90)
112s 0.570% <= 0.407 milliseconds (cumulative count 570)
112s 28.260% <= 0.503 milliseconds (cumulative count 28260)
112s 62.450% <= 0.607 milliseconds (cumulative count 62450)
112s 80.020% <= 0.703 milliseconds (cumulative count 80020)
112s 89.920% <= 0.807 milliseconds (cumulative count 89920)
112s 94.200% <= 0.903 milliseconds (cumulative count 94200)
112s 96.780% <= 1.007 milliseconds (cumulative count 96780)
112s 97.990% <= 1.103 milliseconds (cumulative count 97990)
112s 98.990% <= 1.207 milliseconds (cumulative count 98990)
112s 99.470% <= 1.303 milliseconds (cumulative count 99470)
112s 99.710% <= 1.407 milliseconds (cumulative count 99710)
112s 99.830% <= 1.503 milliseconds (cumulative count 99830)
112s 99.890% <= 1.607 milliseconds (cumulative count 99890)
112s 99.950% <= 1.703 milliseconds (cumulative count 99950)
112s 99.980% <= 1.807 milliseconds (cumulative count 99980)
112s 100.000% <= 2.007 milliseconds (cumulative count 100000)
112s 
112s Summary:
112s throughput summary: 471698.12 requests per second
112s latency summary (msec):
112s avg min p50 p95 p99 max
112s 0.609 0.168 0.559 0.927 1.215 1.983
112s LPUSH (needed to benchmark LRANGE): rps=90400.0 (overall: 470833.3) avg_msec=0.846 (overall: 0.846) ====== LPUSH (needed to benchmark LRANGE) ======
112s 100000 requests completed in 0.21 seconds
112s 50 parallel clients
112s 3 bytes payload
112s keep alive: 1
112s host configuration "save": 3600 1 300 100 60 10000
112s host configuration "appendonly": no
112s multi-thread: no
112s 
112s Latency by percentile distribution:
112s 0.000% <= 0.327 milliseconds (cumulative count 10)
112s 50.000% <= 0.807 milliseconds (cumulative count 50390)
112s 75.000% <= 0.967 milliseconds (cumulative count 75290)
112s 87.500% <= 1.111 milliseconds (cumulative count 88050)
112s 93.750% <= 1.207 milliseconds (cumulative count 94020)
112s 96.875% <= 1.311 milliseconds (cumulative count 96900)
112s 98.438% <= 1.399 milliseconds (cumulative count 98440)
112s 99.219% <= 1.495 milliseconds (cumulative count 99230)
112s 99.609% <= 1.599 milliseconds (cumulative count 99620)
112s 99.805% <= 1.679 milliseconds (cumulative count 99810)
112s 99.902% <= 1.743 milliseconds (cumulative count 99910)
112s 99.951% <= 1.943 milliseconds (cumulative count 99960)
112s 99.976% <= 1.983 milliseconds (cumulative count 99980)
112s 99.988% <= 1.999 milliseconds (cumulative count 99990)
112s 99.994% <= 2.079 milliseconds (cumulative count 100000)
112s 100.000% <= 2.079 milliseconds (cumulative count 100000)
112s 
112s Cumulative distribution of latencies:
112s 0.000% <= 0.103 milliseconds (cumulative count 0)
112s 0.520% <= 0.407 milliseconds (cumulative count 520)
112s 6.020% <= 0.503 milliseconds (cumulative count 6020)
112s 17.950% <= 0.607 milliseconds (cumulative count 17950)
112s 31.700% <= 0.703 milliseconds (cumulative count 31700)
112s 50.390% <= 0.807 milliseconds (cumulative count 50390)
112s 67.240% <= 0.903 milliseconds (cumulative count 67240)
112s 79.680% <= 1.007 milliseconds (cumulative count 79680)
112s 87.450% <= 1.103 milliseconds (cumulative count 87450)
112s 94.020% <= 1.207 milliseconds (cumulative count 94020)
112s 96.710% <= 1.303 milliseconds (cumulative count 96710)
112s 98.490% <= 1.407 milliseconds (cumulative count 98490)
112s 99.270% <= 1.503 milliseconds (cumulative count 99270)
112s 99.640% <= 1.607 milliseconds (cumulative count 99640)
112s 99.850% <= 1.703 milliseconds (cumulative count 99850)
112s 99.930% <= 1.807 milliseconds (cumulative count 99930)
112s 99.990% <= 2.007 milliseconds (cumulative count 99990)
112s 100.000% <= 2.103 milliseconds (cumulative count 100000)
112s 
112s Summary:
112s throughput summary: 480769.22 requests per second
112s latency summary (msec):
112s avg min p50 p95 p99 max
112s 0.828 0.320 0.807 1.239 1.463 2.079
114s LRANGE_100 (first 100 elements): rps=28492.1 (overall: 80674.2) avg_msec=3.261 (overall: 3.261) LRANGE_100 (first 100 elements): rps=86825.4 (overall: 85219.9) avg_msec=2.883 (overall: 2.977) LRANGE_100 (first 100 elements): rps=89322.7 (overall: 86959.5) avg_msec=3.053 (overall: 3.010) LRANGE_100 (first 100 elements): rps=89681.3 (overall: 87769.9) avg_msec=3.006 (overall: 3.008) LRANGE_100 (first 100 elements): rps=98167.3 (overall: 90155.4) avg_msec=2.572 (overall: 2.899) ====== LRANGE_100 (first 100 elements) ======
114s 100000 requests completed in 1.11 seconds
114s 50 parallel clients
114s 3 bytes payload
114s keep alive: 1
114s host configuration "save": 3600 1 300 100 60 10000
114s host configuration "appendonly": no
114s multi-thread: no
114s 
114s Latency by percentile distribution:
114s 0.000% <= 0.679 milliseconds (cumulative count 10)
114s 50.000% <= 2.703 milliseconds (cumulative count 50850)
114s 75.000% <= 2.967 milliseconds (cumulative count 75430)
114s 87.500% <= 3.319 milliseconds (cumulative count 87540)
114s 93.750% <= 3.847 milliseconds (cumulative count 93790)
114s 96.875% <= 4.831 milliseconds (cumulative count 96890)
114s 98.438% <= 6.047 milliseconds (cumulative count 98440)
114s 99.219% <= 7.311 milliseconds (cumulative count 99220)
114s 99.609% <= 8.727 milliseconds (cumulative count 99630)
114s 99.805% <= 13.711 milliseconds (cumulative count 99810)
114s 99.902% <= 13.999 milliseconds (cumulative count 99910)
114s 99.951% <= 14.071 milliseconds (cumulative count 99970)
114s 99.976% <= 14.087 milliseconds (cumulative count 99990)
114s 99.994% <= 14.095 milliseconds (cumulative count 100000)
114s 100.000% <= 14.095 milliseconds (cumulative count 100000)
114s 
114s Cumulative distribution of latencies:
114s 0.000% <= 0.103 milliseconds (cumulative count 0)
114s 0.010% <= 0.703 milliseconds (cumulative count 10)
114s 0.020% <= 0.807 milliseconds (cumulative count 20)
114s 0.030% <= 1.007 milliseconds (cumulative count 30)
114s 0.040% <= 1.207 milliseconds (cumulative count 40)
114s 0.050% <= 1.303 milliseconds (cumulative count 50)
114s 0.060% <= 1.407 milliseconds (cumulative count 60)
114s 0.090% <= 1.503 milliseconds (cumulative count 90)
114s 0.130% <= 1.607 milliseconds (cumulative count 130)
114s 0.170% <= 1.703 milliseconds (cumulative count 170)
114s 0.240% <= 1.807 milliseconds (cumulative count 240)
114s 0.390% <= 1.903 milliseconds (cumulative count 390)
114s 0.660% <= 2.007 milliseconds (cumulative count 660)
114s 1.190% <= 2.103 milliseconds (cumulative count 1190)
114s 82.440% <= 3.103 milliseconds (cumulative count 82440)
114s 95.060% <= 4.103 milliseconds (cumulative count 95060)
114s 97.370% <= 5.103 milliseconds (cumulative count 97370)
114s 98.480% <= 6.103 milliseconds (cumulative count 98480)
114s 99.100% <= 7.103 milliseconds (cumulative count 99100)
114s 99.490% <= 8.103 milliseconds (cumulative count 99490)
114s 99.730% <= 9.103 milliseconds (cumulative count 99730)
114s 99.740% <= 10.103 milliseconds (cumulative count 99740)
114s 99.750% <= 12.103 milliseconds (cumulative count 99750)
114s 100.000% <= 14.103 milliseconds (cumulative count 100000)
114s 
114s Summary:
114s throughput summary: 90252.70 requests per second
114s latency summary (msec):
114s avg min p50 p95 p99 max
114s 2.896 0.672 2.703 4.087 6.887 14.095
117s LRANGE_300 (first 300 elements): rps=26199.2 (overall: 28102.6) avg_msec=8.742 (overall: 8.742) LRANGE_300 (first 300 elements): rps=27130.4 (overall: 27597.5) avg_msec=9.877 (overall: 9.322) LRANGE_300 (first 300 elements): rps=29808.0 (overall: 28347.4) avg_msec=9.015 (overall: 9.213) LRANGE_300 (first 300 elements): rps=30801.6 (overall: 28972.7) avg_msec=8.384 (overall: 8.988) LRANGE_300 (first 300 elements): rps=29024.0 (overall: 28983.1) avg_msec=8.904 (overall: 8.971) LRANGE_300 (first 300 elements): rps=28968.7 (overall: 28980.6) avg_msec=8.622 (overall: 8.911) LRANGE_300 (first 300 elements): rps=25187.3 (overall: 28435.3) avg_msec=11.837 (overall: 9.284) LRANGE_300 (first 300 elements): rps=26474.3 (overall: 28187.1) avg_msec=9.453 (overall: 9.304) LRANGE_300 (first 300 elements): rps=29247.1 (overall: 28307.0) avg_msec=8.429 (overall: 9.202) LRANGE_300 (first 300 elements): rps=24764.9 (overall: 27952.1) avg_msec=11.215 (overall: 9.380) LRANGE_300 (first 300 elements): rps=27834.0 (overall: 27941.3) avg_msec=9.281 (overall: 9.371) LRANGE_300 (first 300 elements): rps=25525.7 (overall: 27738.3) avg_msec=10.998 (overall: 9.497) LRANGE_300 (first 300 elements): rps=28293.7 (overall: 27781.2) avg_msec=9.485 (overall: 9.496) LRANGE_300 (first 300 elements): rps=26865.1 (overall: 27715.5) avg_msec=10.129 (overall: 9.540) ====== LRANGE_300 (first 300 elements) ======
117s 100000 requests completed in 3.60 seconds
117s 50 parallel clients
117s 3 bytes payload
117s keep alive: 1
117s host configuration "save": 3600 1 300 100 60 10000
117s host configuration "appendonly": no
117s multi-thread: no
117s 
117s Latency by percentile distribution:
117s 0.000% <= 0.583 milliseconds (cumulative count 10)
117s 50.000% <= 8.815 milliseconds (cumulative count 50090)
117s 75.000% <= 11.271 milliseconds (cumulative count 75000)
117s 87.500% <= 13.871 milliseconds (cumulative count 87520)
117s 93.750% <= 16.031 milliseconds (cumulative count 93770)
117s 96.875% <= 18.255 milliseconds (cumulative count 96880)
117s 98.438% <= 20.671 milliseconds (cumulative count 98440)
117s 99.219% <= 22.671 milliseconds (cumulative count 99230)
117s 99.609% <= 23.599 milliseconds (cumulative count 99610)
117s 99.805% <= 24.383 milliseconds (cumulative count 99820)
117s 99.902% <= 24.799 milliseconds (cumulative count 99910)
117s 99.951% <= 25.135 milliseconds (cumulative count 99960)
117s 99.976% <= 25.327 milliseconds (cumulative count 99980)
117s 99.988% <= 25.519 milliseconds (cumulative count 99990)
117s 99.994% <= 25.727 milliseconds (cumulative count 100000)
117s 100.000% <= 25.727 milliseconds (cumulative count 100000)
117s 
117s Cumulative distribution of latencies:
117s 0.000% <= 0.103 milliseconds (cumulative count 0)
117s 0.010% <= 0.607 milliseconds (cumulative count 10)
117s 0.030% <= 0.903 milliseconds (cumulative count 30)
117s 0.040% <= 1.007 milliseconds (cumulative count 40)
117s 0.050% <= 1.103 milliseconds (cumulative count 50)
117s 0.110% <= 1.207 milliseconds (cumulative count 110)
117s 0.190% <= 1.303 milliseconds (cumulative count 190)
117s 0.260% <= 1.407 milliseconds (cumulative count 260)
117s 0.390% <= 1.503 milliseconds (cumulative count 390)
117s 0.500% <= 1.607 milliseconds (cumulative count 500)
117s 0.600% <= 1.703 milliseconds (cumulative count 600)
117s 0.740% <= 1.807 milliseconds (cumulative count 740)
117s 0.800% <= 1.903 milliseconds (cumulative count 800)
117s 0.900% <= 2.007 milliseconds (cumulative count 900)
117s 0.950% <= 2.103 milliseconds (cumulative count 950)
117s 1.660% <= 3.103 milliseconds (cumulative count 1660)
117s 3.140% <= 4.103 milliseconds (cumulative count 3140)
117s 6.570% <= 5.103 milliseconds (cumulative count 6570)
117s 13.730% <= 6.103 milliseconds (cumulative count 13730)
117s 25.470% <= 7.103 milliseconds (cumulative count 25470)
117s 39.780% <= 8.103 milliseconds (cumulative count 39780)
117s 54.080% <= 9.103 milliseconds (cumulative count 54080)
117s 65.870% <= 10.103 milliseconds (cumulative count 65870)
117s 73.870% <= 11.103 milliseconds (cumulative count 73870)
117s 79.560% <= 12.103 milliseconds (cumulative count 79560)
117s 84.270% <= 13.103 milliseconds (cumulative count 84270)
117s 88.400% <= 14.103 milliseconds (cumulative count 88400)
117s 91.650% <= 15.103 milliseconds (cumulative count 91650)
117s 93.910% <= 16.103 milliseconds (cumulative count 93910)
117s 95.670% <= 17.103 milliseconds (cumulative count 95670)
117s 96.750% <= 18.111 milliseconds (cumulative count 96750)
117s 97.380% <= 19.103 milliseconds (cumulative count 97380)
117s 98.130% <= 20.111 milliseconds (cumulative count 98130)
117s 98.560% <= 21.103 milliseconds (cumulative count 98560)
117s 99.000% <= 22.111 milliseconds (cumulative count 99000)
117s 99.410% <= 23.103 milliseconds (cumulative count 
99410) 117s 99.740% <= 24.111 milliseconds (cumulative count 99740) 117s 99.950% <= 25.103 milliseconds (cumulative count 99950) 117s 100.000% <= 26.111 milliseconds (cumulative count 100000) 117s 117s Summary: 117s throughput summary: 27754.65 requests per second 117s latency summary (msec): 117s avg min p50 p95 p99 max 117s 9.528 0.576 8.815 16.655 22.111 25.727 124s LRANGE_500 (first 500 elements): rps=7083.0 (overall: 10993.9) avg_msec=19.490 (overall: 19.490) LRANGE_500 (first 500 elements): rps=13867.2 (overall: 12749.4) avg_msec=21.666 (overall: 20.936) LRANGE_500 (first 500 elements): rps=15209.5 (overall: 13675.6) avg_msec=15.078 (overall: 18.483) LRANGE_500 (first 500 elements): rps=14570.9 (overall: 13921.2) avg_msec=16.045 (overall: 17.783) LRANGE_500 (first 500 elements): rps=14328.1 (overall: 14009.3) avg_msec=14.717 (overall: 17.104) LRANGE_500 (first 500 elements): rps=15731.2 (overall: 14312.9) avg_msec=15.607 (overall: 16.814) LRANGE_500 (first 500 elements): rps=17244.1 (overall: 14753.7) avg_msec=12.140 (overall: 15.992) LRANGE_500 (first 500 elements): rps=16424.6 (overall: 14970.6) avg_msec=12.489 (overall: 15.493) LRANGE_500 (first 500 elements): rps=15604.7 (overall: 15043.8) avg_msec=15.623 (overall: 15.509) LRANGE_500 (first 500 elements): rps=17245.1 (overall: 15271.4) avg_msec=11.892 (overall: 15.087) LRANGE_500 (first 500 elements): rps=17568.0 (overall: 15484.2) avg_msec=11.422 (overall: 14.701) LRANGE_500 (first 500 elements): rps=13276.7 (overall: 15294.9) avg_msec=20.755 (overall: 15.152) LRANGE_500 (first 500 elements): rps=11597.6 (overall: 15005.0) avg_msec=25.231 (overall: 15.763) LRANGE_500 (first 500 elements): rps=11510.0 (overall: 14750.9) avg_msec=23.224 (overall: 16.186) LRANGE_500 (first 500 elements): rps=13177.2 (overall: 14643.0) avg_msec=20.116 (overall: 16.428) LRANGE_500 (first 500 elements): rps=13079.7 (overall: 14543.8) avg_msec=19.386 (overall: 16.597) LRANGE_500 (first 500 elements): rps=15144.0 (overall: 
14579.5) avg_msec=17.293 (overall: 16.640) LRANGE_500 (first 500 elements): rps=15916.7 (overall: 14655.1) avg_msec=16.026 (overall: 16.602) LRANGE_500 (first 500 elements): rps=14075.7 (overall: 14624.2) avg_msec=18.101 (overall: 16.679) LRANGE_500 (first 500 elements): rps=14120.0 (overall: 14598.8) avg_msec=17.768 (overall: 16.732) LRANGE_500 (first 500 elements): rps=14916.3 (overall: 14614.1) avg_msec=16.603 (overall: 16.726) LRANGE_500 (first 500 elements): rps=15804.7 (overall: 14669.8) avg_msec=14.462 (overall: 16.612) LRANGE_500 (first 500 elements): rps=16482.2 (overall: 14750.0) avg_msec=12.634 (overall: 16.415) LRANGE_500 (first 500 elements): rps=16730.2 (overall: 14833.6) avg_msec=13.257 (overall: 16.265) LRANGE_500 (first 500 elements): rps=15976.4 (overall: 14880.2) avg_msec=16.049 (overall: 16.255) LRANGE_500 (first 500 elements): rps=16368.6 (overall: 14938.7) avg_msec=12.329 (overall: 16.086) ====== LRANGE_500 (first 500 elements) ====== 124s 100000 requests completed in 6.69 seconds 124s 50 parallel clients 124s 3 bytes payload 124s keep alive: 1 124s host configuration "save": 3600 1 300 100 60 10000 124s host configuration "appendonly": no 124s multi-thread: no 124s 124s Latency by percentile distribution: 124s 0.000% <= 0.535 milliseconds (cumulative count 10) 124s 50.000% <= 14.439 milliseconds (cumulative count 50070) 124s 75.000% <= 20.223 milliseconds (cumulative count 75000) 124s 87.500% <= 24.927 milliseconds (cumulative count 87510) 124s 93.750% <= 28.847 milliseconds (cumulative count 93780) 124s 96.875% <= 31.583 milliseconds (cumulative count 96890) 124s 98.438% <= 33.791 milliseconds (cumulative count 98450) 124s 99.219% <= 35.743 milliseconds (cumulative count 99220) 124s 99.609% <= 37.695 milliseconds (cumulative count 99610) 124s 99.805% <= 39.903 milliseconds (cumulative count 99810) 124s 99.902% <= 40.959 milliseconds (cumulative count 99910) 124s 99.951% <= 45.279 milliseconds (cumulative count 99960) 124s 99.976% <= 45.663 
milliseconds (cumulative count 99980) 124s 99.988% <= 49.055 milliseconds (cumulative count 99990) 124s 99.994% <= 49.247 milliseconds (cumulative count 100000) 124s 100.000% <= 49.247 milliseconds (cumulative count 100000) 124s 124s Cumulative distribution of latencies: 124s 0.000% <= 0.103 milliseconds (cumulative count 0) 124s 0.010% <= 0.607 milliseconds (cumulative count 10) 124s 0.020% <= 1.207 milliseconds (cumulative count 20) 124s 0.050% <= 1.407 milliseconds (cumulative count 50) 124s 0.110% <= 1.503 milliseconds (cumulative count 110) 124s 0.120% <= 1.607 milliseconds (cumulative count 120) 124s 0.180% <= 1.703 milliseconds (cumulative count 180) 124s 0.270% <= 1.807 milliseconds (cumulative count 270) 124s 0.310% <= 1.903 milliseconds (cumulative count 310) 124s 0.400% <= 2.007 milliseconds (cumulative count 400) 124s 0.430% <= 2.103 milliseconds (cumulative count 430) 124s 1.120% <= 3.103 milliseconds (cumulative count 1120) 124s 1.910% <= 4.103 milliseconds (cumulative count 1910) 124s 2.650% <= 5.103 milliseconds (cumulative count 2650) 124s 4.000% <= 6.103 milliseconds (cumulative count 4000) 124s 5.260% <= 7.103 milliseconds (cumulative count 5260) 124s 7.940% <= 8.103 milliseconds (cumulative count 7940) 124s 12.410% <= 9.103 milliseconds (cumulative count 12410) 124s 18.370% <= 10.103 milliseconds (cumulative count 18370) 124s 25.100% <= 11.103 milliseconds (cumulative count 25100) 124s 32.260% <= 12.103 milliseconds (cumulative count 32260) 124s 40.470% <= 13.103 milliseconds (cumulative count 40470) 124s 47.930% <= 14.103 milliseconds (cumulative count 47930) 124s 53.830% <= 15.103 milliseconds (cumulative count 53830) 124s 59.040% <= 16.103 milliseconds (cumulative count 59040) 124s 63.700% <= 17.103 milliseconds (cumulative count 63700) 124s 67.930% <= 18.111 milliseconds (cumulative count 67930) 124s 71.370% <= 19.103 milliseconds (cumulative count 71370) 124s 74.590% <= 20.111 milliseconds (cumulative count 74590) 124s 78.020% <= 21.103 
milliseconds (cumulative count 78020) 124s 80.830% <= 22.111 milliseconds (cumulative count 80830) 124s 83.490% <= 23.103 milliseconds (cumulative count 83490) 124s 85.700% <= 24.111 milliseconds (cumulative count 85700) 124s 87.900% <= 25.103 milliseconds (cumulative count 87900) 124s 90.060% <= 26.111 milliseconds (cumulative count 90060) 124s 91.790% <= 27.103 milliseconds (cumulative count 91790) 124s 92.980% <= 28.111 milliseconds (cumulative count 92980) 124s 94.130% <= 29.103 milliseconds (cumulative count 94130) 124s 95.360% <= 30.111 milliseconds (cumulative count 95360) 124s 96.460% <= 31.103 milliseconds (cumulative count 96460) 124s 97.270% <= 32.111 milliseconds (cumulative count 97270) 124s 98.020% <= 33.119 milliseconds (cumulative count 98020) 124s 98.610% <= 34.111 milliseconds (cumulative count 98610) 124s 99.040% <= 35.103 milliseconds (cumulative count 99040) 124s 99.310% <= 36.127 milliseconds (cumulative count 99310) 124s 99.500% <= 37.119 milliseconds (cumulative count 99500) 124s 99.630% <= 38.111 milliseconds (cumulative count 99630) 124s 99.720% <= 39.103 milliseconds (cumulative count 99720) 124s 99.830% <= 40.127 milliseconds (cumulative count 99830) 124s 99.920% <= 41.119 milliseconds (cumulative count 99920) 124s 99.950% <= 42.111 milliseconds (cumulative count 99950) 124s 99.980% <= 46.111 milliseconds (cumulative count 99980) 124s 99.990% <= 49.119 milliseconds (cumulative count 99990) 124s 100.000% <= 50.111 milliseconds (cumulative count 100000) 124s 124s Summary: 124s throughput summary: 14956.63 requests per second 124s latency summary (msec): 124s avg min p50 p95 p99 max 124s 16.031 0.528 14.439 29.855 35.007 49.247 133s LRANGE_600 (first 600 elements): rps=1772.5 (overall: 9416.7) avg_msec=24.625 (overall: 24.625) LRANGE_600 (first 600 elements): rps=10180.8 (overall: 10061.7) avg_msec=24.179 (overall: 24.244) LRANGE_600 (first 600 elements): rps=9810.8 (overall: 9947.1) avg_msec=24.470 (overall: 24.346) LRANGE_600 (first 600 
elements): rps=10201.6 (overall: 10025.6) avg_msec=23.810 (overall: 24.178) LRANGE_600 (first 600 elements): rps=12814.2 (overall: 10683.1) avg_msec=18.617 (overall: 22.605) LRANGE_600 (first 600 elements): rps=11882.8 (overall: 10914.2) avg_msec=23.238 (overall: 22.738) LRANGE_600 (first 600 elements): rps=11616.3 (overall: 11028.4) avg_msec=21.385 (overall: 22.506) LRANGE_600 (first 600 elements): rps=10537.3 (overall: 10960.4) avg_msec=22.928 (overall: 22.562) LRANGE_600 (first 600 elements): rps=12102.0 (overall: 11099.2) avg_msec=22.309 (overall: 22.529) LRANGE_600 (first 600 elements): rps=12426.3 (overall: 11241.1) avg_msec=20.216 (overall: 22.255) LRANGE_600 (first 600 elements): rps=11179.3 (overall: 11235.1) avg_msec=22.415 (overall: 22.271) LRANGE_600 (first 600 elements): rps=10680.0 (overall: 11186.4) avg_msec=23.501 (overall: 22.374) LRANGE_600 (first 600 elements): rps=12984.1 (overall: 11332.5) avg_msec=18.666 (overall: 22.029) LRANGE_600 (first 600 elements): rps=13897.2 (overall: 11525.9) avg_msec=14.752 (overall: 21.367) LRANGE_600 (first 600 elements): rps=10665.3 (overall: 11466.0) avg_msec=23.592 (overall: 21.511) LRANGE_600 (first 600 elements): rps=10303.1 (overall: 11389.5) avg_msec=23.336 (overall: 21.620) LRANGE_600 (first 600 elements): rps=9632.0 (overall: 11282.5) avg_msec=25.669 (overall: 21.830) LRANGE_600 (first 600 elements): rps=10565.2 (overall: 11240.9) avg_msec=25.154 (overall: 22.011) LRANGE_600 (first 600 elements): rps=9509.8 (overall: 11145.3) avg_msec=25.384 (overall: 22.170) LRANGE_600 (first 600 elements): rps=10210.1 (overall: 11096.0) avg_msec=24.282 (overall: 22.273) LRANGE_600 (first 600 elements): rps=9553.8 (overall: 11020.5) avg_msec=25.247 (overall: 22.399) LRANGE_600 (first 600 elements): rps=10191.4 (overall: 10981.0) avg_msec=25.215 (overall: 22.523) LRANGE_600 (first 600 elements): rps=9455.3 (overall: 10911.5) avg_msec=25.373 (overall: 22.636) LRANGE_600 (first 600 elements): rps=9823.5 (overall: 10864.4) 
avg_msec=26.023 (overall: 22.768) LRANGE_600 (first 600 elements): rps=12647.1 (overall: 10938.4) avg_msec=19.498 (overall: 22.611) LRANGE_600 (first 600 elements): rps=13118.6 (overall: 11024.5) avg_msec=19.375 (overall: 22.459) LRANGE_600 (first 600 elements): rps=10916.7 (overall: 11020.4) avg_msec=20.881 (overall: 22.400) LRANGE_600 (first 600 elements): rps=9980.5 (overall: 10981.9) avg_msec=29.296 (overall: 22.632) LRANGE_600 (first 600 elements): rps=9619.6 (overall: 10933.4) avg_msec=26.271 (overall: 22.746) LRANGE_600 (first 600 elements): rps=9892.4 (overall: 10898.2) avg_msec=24.936 (overall: 22.814) LRANGE_600 (first 600 elements): rps=14035.7 (overall: 11001.3) avg_msec=15.076 (overall: 22.489) LRANGE_600 (first 600 elements): rps=13749.0 (overall: 11088.4) avg_msec=16.847 (overall: 22.267) LRANGE_600 (first 600 elements): rps=12239.0 (overall: 11123.8) avg_msec=17.969 (overall: 22.122) LRANGE_600 (first 600 elements): rps=10043.3 (overall: 11091.2) avg_msec=25.165 (overall: 22.205) LRANGE_600 (first 600 elements): rps=12853.2 (overall: 11142.4) avg_msec=17.732 (overall: 22.055) ====== LRANGE_600 (first 600 elements) ====== 133s 100000 requests completed in 8.90 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.567 milliseconds (cumulative count 10) 133s 50.000% <= 22.703 milliseconds (cumulative count 50000) 133s 75.000% <= 27.615 milliseconds (cumulative count 75050) 133s 87.500% <= 32.111 milliseconds (cumulative count 87510) 133s 93.750% <= 34.975 milliseconds (cumulative count 93810) 133s 96.875% <= 36.671 milliseconds (cumulative count 96910) 133s 98.438% <= 37.759 milliseconds (cumulative count 98440) 133s 99.219% <= 38.527 milliseconds (cumulative count 99250) 133s 99.609% <= 39.103 milliseconds (cumulative count 99610) 133s 99.805% <= 39.743 
milliseconds (cumulative count 99810) 133s 99.902% <= 40.351 milliseconds (cumulative count 99910) 133s 99.951% <= 41.215 milliseconds (cumulative count 99960) 133s 99.976% <= 41.663 milliseconds (cumulative count 99980) 133s 99.988% <= 41.759 milliseconds (cumulative count 99990) 133s 99.994% <= 41.951 milliseconds (cumulative count 100000) 133s 100.000% <= 41.951 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 0.010% <= 0.607 milliseconds (cumulative count 10) 133s 0.020% <= 1.103 milliseconds (cumulative count 20) 133s 0.050% <= 1.207 milliseconds (cumulative count 50) 133s 0.100% <= 1.303 milliseconds (cumulative count 100) 133s 0.150% <= 1.407 milliseconds (cumulative count 150) 133s 0.250% <= 1.503 milliseconds (cumulative count 250) 133s 0.330% <= 1.607 milliseconds (cumulative count 330) 133s 0.620% <= 1.703 milliseconds (cumulative count 620) 133s 0.880% <= 1.807 milliseconds (cumulative count 880) 133s 1.210% <= 1.903 milliseconds (cumulative count 1210) 133s 1.490% <= 2.007 milliseconds (cumulative count 1490) 133s 1.660% <= 2.103 milliseconds (cumulative count 1660) 133s 3.410% <= 3.103 milliseconds (cumulative count 3410) 133s 3.860% <= 4.103 milliseconds (cumulative count 3860) 133s 4.090% <= 5.103 milliseconds (cumulative count 4090) 133s 4.600% <= 6.103 milliseconds (cumulative count 4600) 133s 5.470% <= 7.103 milliseconds (cumulative count 5470) 133s 6.360% <= 8.103 milliseconds (cumulative count 6360) 133s 7.780% <= 9.103 milliseconds (cumulative count 7780) 133s 9.720% <= 10.103 milliseconds (cumulative count 9720) 133s 12.200% <= 11.103 milliseconds (cumulative count 12200) 133s 15.270% <= 12.103 milliseconds (cumulative count 15270) 133s 18.360% <= 13.103 milliseconds (cumulative count 18360) 133s 21.330% <= 14.103 milliseconds (cumulative count 21330) 133s 23.950% <= 15.103 milliseconds (cumulative count 23950) 133s 26.300% <= 16.103 milliseconds 
(cumulative count 26300) 133s 28.580% <= 17.103 milliseconds (cumulative count 28580) 133s 31.120% <= 18.111 milliseconds (cumulative count 31120) 133s 34.090% <= 19.103 milliseconds (cumulative count 34090) 133s 37.580% <= 20.111 milliseconds (cumulative count 37580) 133s 41.650% <= 21.103 milliseconds (cumulative count 41650) 133s 46.870% <= 22.111 milliseconds (cumulative count 46870) 133s 52.110% <= 23.103 milliseconds (cumulative count 52110) 133s 57.300% <= 24.111 milliseconds (cumulative count 57300) 133s 62.500% <= 25.103 milliseconds (cumulative count 62500) 133s 67.700% <= 26.111 milliseconds (cumulative count 67700) 133s 72.600% <= 27.103 milliseconds (cumulative count 72600) 133s 77.090% <= 28.111 milliseconds (cumulative count 77090) 133s 80.790% <= 29.103 milliseconds (cumulative count 80790) 133s 83.430% <= 30.111 milliseconds (cumulative count 83430) 133s 85.570% <= 31.103 milliseconds (cumulative count 85570) 133s 87.510% <= 32.111 milliseconds (cumulative count 87510) 133s 89.830% <= 33.119 milliseconds (cumulative count 89830) 133s 92.050% <= 34.111 milliseconds (cumulative count 92050) 133s 94.110% <= 35.103 milliseconds (cumulative count 94110) 133s 95.970% <= 36.127 milliseconds (cumulative count 95970) 133s 97.600% <= 37.119 milliseconds (cumulative count 97600) 133s 98.860% <= 38.111 milliseconds (cumulative count 98860) 133s 99.610% <= 39.103 milliseconds (cumulative count 99610) 133s 99.880% <= 40.127 milliseconds (cumulative count 99880) 133s 99.950% <= 41.119 milliseconds (cumulative count 99950) 133s 100.000% <= 42.111 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 11229.65 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 21.818 0.560 22.703 35.551 38.239 41.951 133s MSET (10 keys): rps=14023.9 (overall: 195555.6) avg_msec=2.107 (overall: 2.107) MSET (10 keys): rps=208645.4 (overall: 207769.5) avg_msec=2.257 (overall: 2.247) ====== MSET (10 keys) ====== 133s 100000 
requests completed in 0.47 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.647 milliseconds (cumulative count 10) 133s 50.000% <= 2.199 milliseconds (cumulative count 50340) 133s 75.000% <= 2.495 milliseconds (cumulative count 75120) 133s 87.500% <= 2.783 milliseconds (cumulative count 87670) 133s 93.750% <= 3.071 milliseconds (cumulative count 93860) 133s 96.875% <= 3.319 milliseconds (cumulative count 96920) 133s 98.438% <= 3.535 milliseconds (cumulative count 98450) 133s 99.219% <= 3.711 milliseconds (cumulative count 99220) 133s 99.609% <= 3.863 milliseconds (cumulative count 99620) 133s 99.805% <= 4.103 milliseconds (cumulative count 99810) 133s 99.902% <= 4.295 milliseconds (cumulative count 99910) 133s 99.951% <= 4.431 milliseconds (cumulative count 99960) 133s 99.976% <= 4.479 milliseconds (cumulative count 99980) 133s 99.988% <= 4.535 milliseconds (cumulative count 99990) 133s 99.994% <= 4.559 milliseconds (cumulative count 100000) 133s 100.000% <= 4.559 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 0.060% <= 0.703 milliseconds (cumulative count 60) 133s 0.160% <= 0.807 milliseconds (cumulative count 160) 133s 0.210% <= 0.903 milliseconds (cumulative count 210) 133s 0.260% <= 1.007 milliseconds (cumulative count 260) 133s 0.560% <= 1.103 milliseconds (cumulative count 560) 133s 1.650% <= 1.207 milliseconds (cumulative count 1650) 133s 3.680% <= 1.303 milliseconds (cumulative count 3680) 133s 8.000% <= 1.407 milliseconds (cumulative count 8000) 133s 11.450% <= 1.503 milliseconds (cumulative count 11450) 133s 14.840% <= 1.607 milliseconds (cumulative count 14840) 133s 18.000% <= 1.703 milliseconds (cumulative count 18000) 133s 21.440% <= 1.807 milliseconds 
(cumulative count 21440) 133s 25.950% <= 1.903 milliseconds (cumulative count 25950) 133s 32.810% <= 2.007 milliseconds (cumulative count 32810) 133s 41.220% <= 2.103 milliseconds (cumulative count 41220) 133s 94.360% <= 3.103 milliseconds (cumulative count 94360) 133s 99.810% <= 4.103 milliseconds (cumulative count 99810) 133s 100.000% <= 5.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 212765.95 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 2.202 0.640 2.199 3.151 3.647 4.559 133s XADD: rps=81240.0 (overall: 451333.3) avg_msec=0.977 (overall: 0.977) ====== XADD ====== 133s 100000 requests completed in 0.22 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.311 milliseconds (cumulative count 10) 133s 50.000% <= 0.935 milliseconds (cumulative count 51200) 133s 75.000% <= 1.095 milliseconds (cumulative count 75590) 133s 87.500% <= 1.199 milliseconds (cumulative count 88060) 133s 93.750% <= 1.271 milliseconds (cumulative count 94000) 133s 96.875% <= 1.351 milliseconds (cumulative count 97090) 133s 98.438% <= 1.423 milliseconds (cumulative count 98480) 133s 99.219% <= 1.511 milliseconds (cumulative count 99230) 133s 99.609% <= 1.615 milliseconds (cumulative count 99620) 133s 99.805% <= 1.711 milliseconds (cumulative count 99810) 133s 99.902% <= 1.799 milliseconds (cumulative count 99910) 133s 99.951% <= 1.935 milliseconds (cumulative count 99960) 133s 99.976% <= 1.983 milliseconds (cumulative count 99980) 133s 99.988% <= 2.007 milliseconds (cumulative count 99990) 133s 99.994% <= 2.023 milliseconds (cumulative count 100000) 133s 100.000% <= 2.023 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 
0.120% <= 0.407 milliseconds (cumulative count 120) 133s 0.580% <= 0.503 milliseconds (cumulative count 580) 133s 1.910% <= 0.607 milliseconds (cumulative count 1910) 133s 7.180% <= 0.703 milliseconds (cumulative count 7180) 133s 26.690% <= 0.807 milliseconds (cumulative count 26690) 133s 45.710% <= 0.903 milliseconds (cumulative count 45710) 133s 63.400% <= 1.007 milliseconds (cumulative count 63400) 133s 76.530% <= 1.103 milliseconds (cumulative count 76530) 133s 88.960% <= 1.207 milliseconds (cumulative count 88960) 133s 95.610% <= 1.303 milliseconds (cumulative count 95610) 133s 98.210% <= 1.407 milliseconds (cumulative count 98210) 133s 99.160% <= 1.503 milliseconds (cumulative count 99160) 133s 99.600% <= 1.607 milliseconds (cumulative count 99600) 133s 99.790% <= 1.703 milliseconds (cumulative count 99790) 133s 99.910% <= 1.807 milliseconds (cumulative count 99910) 133s 99.940% <= 1.903 milliseconds (cumulative count 99940) 133s 99.990% <= 2.007 milliseconds (cumulative count 99990) 133s 100.000% <= 2.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 462962.94 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.953 0.304 0.935 1.295 1.479 2.023 137s FUNCTION LOAD: rps=8095.2 (overall: 25822.8) avg_msec=17.030 (overall: 17.030) FUNCTION LOAD: rps=27320.0 (overall: 26960.5) avg_msec=18.303 (overall: 18.010) FUNCTION LOAD: rps=26071.4 (overall: 26574.9) avg_msec=18.641 (overall: 18.279) FUNCTION LOAD: rps=26414.3 (overall: 26526.4) avg_msec=18.798 (overall: 18.435) FUNCTION LOAD: rps=26175.3 (overall: 26445.1) avg_msec=19.069 (overall: 18.580) FUNCTION LOAD: rps=25476.2 (overall: 26262.2) avg_msec=18.926 (overall: 18.643) FUNCTION LOAD: rps=24280.0 (overall: 25949.5) avg_msec=20.562 (overall: 18.927) FUNCTION LOAD: rps=25298.8 (overall: 25860.6) avg_msec=19.687 (overall: 19.028) FUNCTION LOAD: rps=26812.8 (overall: 25975.1) avg_msec=18.819 (overall: 19.002) FUNCTION LOAD: rps=24621.5 
(overall: 25829.8) avg_msec=20.204 (overall: 19.125) FUNCTION LOAD: rps=26200.0 (overall: 25865.5) avg_msec=18.736 (overall: 19.087) FUNCTION LOAD: rps=25476.2 (overall: 25831.0) avg_msec=19.204 (overall: 19.098) FUNCTION LOAD: rps=24302.8 (overall: 25706.9) avg_msec=20.013 (overall: 19.168) FUNCTION LOAD: rps=26613.5 (overall: 25775.0) avg_msec=18.837 (overall: 19.142) FUNCTION LOAD: rps=25360.0 (overall: 25746.1) avg_msec=19.232 (overall: 19.148) FUNCTION LOAD: rps=27251.0 (overall: 25844.4) avg_msec=18.827 (overall: 19.126) ====== FUNCTION LOAD ====== 137s 100000 requests completed in 3.87 seconds 137s 50 parallel clients 137s 3 bytes payload 137s keep alive: 1 137s host configuration "save": 3600 1 300 100 60 10000 137s host configuration "appendonly": no 137s multi-thread: no 137s 137s Latency by percentile distribution: 137s 0.000% <= 0.623 milliseconds (cumulative count 10) 137s 50.000% <= 20.479 milliseconds (cumulative count 50410) 137s 75.000% <= 21.439 milliseconds (cumulative count 75210) 137s 87.500% <= 22.159 milliseconds (cumulative count 87630) 137s 93.750% <= 22.943 milliseconds (cumulative count 93830) 137s 96.875% <= 24.351 milliseconds (cumulative count 96880) 137s 98.438% <= 28.127 milliseconds (cumulative count 98440) 137s 99.219% <= 32.703 milliseconds (cumulative count 99230) 137s 99.609% <= 33.375 milliseconds (cumulative count 99610) 137s 99.805% <= 33.855 milliseconds (cumulative count 99810) 137s 99.902% <= 34.559 milliseconds (cumulative count 99910) 137s 99.951% <= 34.783 milliseconds (cumulative count 99960) 137s 99.976% <= 34.847 milliseconds (cumulative count 99980) 137s 99.988% <= 34.879 milliseconds (cumulative count 99990) 137s 99.994% <= 35.167 milliseconds (cumulative count 100000) 137s 100.000% <= 35.167 milliseconds (cumulative count 100000) 137s 137s Cumulative distribution of latencies: 137s 0.000% <= 0.103 milliseconds (cumulative count 0) 137s 0.010% <= 0.703 milliseconds (cumulative count 10) 137s 0.120% <= 7.103 
milliseconds (cumulative count 120) 137s 0.310% <= 8.103 milliseconds (cumulative count 310) 137s 0.870% <= 9.103 milliseconds (cumulative count 870) 137s 2.870% <= 10.103 milliseconds (cumulative count 2870) 137s 9.620% <= 11.103 milliseconds (cumulative count 9620) 137s 16.390% <= 12.103 milliseconds (cumulative count 16390) 137s 17.850% <= 13.103 milliseconds (cumulative count 17850) 137s 18.190% <= 14.103 milliseconds (cumulative count 18190) 137s 18.480% <= 15.103 milliseconds (cumulative count 18480) 137s 19.050% <= 16.103 milliseconds (cumulative count 19050) 137s 19.660% <= 17.103 milliseconds (cumulative count 19660) 137s 21.000% <= 18.111 milliseconds (cumulative count 21000) 137s 28.610% <= 19.103 milliseconds (cumulative count 28610) 137s 43.160% <= 20.111 milliseconds (cumulative count 43160) 137s 67.070% <= 21.103 milliseconds (cumulative count 67070) 137s 86.960% <= 22.111 milliseconds (cumulative count 86960) 137s 94.710% <= 23.103 milliseconds (cumulative count 94710) 137s 96.670% <= 24.111 milliseconds (cumulative count 96670) 137s 97.110% <= 25.103 milliseconds (cumulative count 97110) 137s 97.590% <= 26.111 milliseconds (cumulative count 97590) 137s 98.290% <= 27.103 milliseconds (cumulative count 98290) 137s 98.430% <= 28.111 milliseconds (cumulative count 98430) 137s 98.720% <= 29.103 milliseconds (cumulative count 98720) 137s 98.840% <= 30.111 milliseconds (cumulative count 98840) 137s 99.030% <= 32.111 milliseconds (cumulative count 99030) 137s 99.420% <= 33.119 milliseconds (cumulative count 99420) 137s 99.820% <= 34.111 milliseconds (cumulative count 99820) 137s 99.990% <= 35.103 milliseconds (cumulative count 99990) 137s 100.000% <= 36.127 milliseconds (cumulative count 100000) 137s 137s Summary: 137s throughput summary: 25859.84 requests per second 137s latency summary (msec): 137s avg min p50 p95 p99 max 137s 19.125 0.616 20.479 23.183 31.727 35.167 137s ====== FCALL ====== 137s 100000 requests completed in 0.21 seconds 137s 50 parallel 
clients 137s 3 bytes payload 137s keep alive: 1 137s host configuration "save": 3600 1 300 100 60 10000 137s host configuration "appendonly": no 137s multi-thread: no 137s 137s Latency by percentile distribution: 137s 0.000% <= 0.343 milliseconds (cumulative count 10) 137s 50.000% <= 0.919 milliseconds (cumulative count 50490) 137s 75.000% <= 1.087 milliseconds (cumulative count 75470) 137s 87.500% <= 1.215 milliseconds (cumulative count 87950) 137s 93.750% <= 1.335 milliseconds (cumulative count 93880) 137s 96.875% <= 1.455 milliseconds (cumulative count 96970) 137s 98.438% <= 1.575 milliseconds (cumulative count 98550) 137s 99.219% <= 1.663 milliseconds (cumulative count 99260) 137s 99.609% <= 1.735 milliseconds (cumulative count 99620) 137s 99.805% <= 1.807 milliseconds (cumulative count 99840) 137s 99.902% <= 1.903 milliseconds (cumulative count 99920) 137s 99.951% <= 1.975 milliseconds (cumulative count 99960) 137s 99.976% <= 2.031 milliseconds (cumulative count 99980) 137s 99.988% <= 2.095 milliseconds (cumulative count 99990) 137s 99.994% <= 2.103 milliseconds (cumulative count 100000) 137s 100.000% <= 2.103 milliseconds (cumulative count 100000) 137s 137s Cumulative distribution of latencies: 137s 0.000% <= 0.103 milliseconds (cumulative count 0) 137s 0.060% <= 0.407 milliseconds (cumulative count 60) 137s 0.380% <= 0.503 milliseconds (cumulative count 380) 137s 1.230% <= 0.607 milliseconds (cumulative count 1230) 137s 11.140% <= 0.703 milliseconds (cumulative count 11140) 137s 29.310% <= 0.807 milliseconds (cumulative count 29310) 137s 47.540% <= 0.903 milliseconds (cumulative count 47540) 137s 64.310% <= 1.007 milliseconds (cumulative count 64310) 137s 77.530% <= 1.103 milliseconds (cumulative count 77530) 137s 87.310% <= 1.207 milliseconds (cumulative count 87310) 137s 92.810% <= 1.303 milliseconds (cumulative count 92810) 137s 96.010% <= 1.407 milliseconds (cumulative count 96010) 137s 97.680% <= 1.503 milliseconds (cumulative count 97680) 137s 98.890% 
<= 1.607 milliseconds (cumulative count 98890) 137s 99.470% <= 1.703 milliseconds (cumulative count 99470) 137s 99.840% <= 1.807 milliseconds (cumulative count 99840) 137s 99.920% <= 1.903 milliseconds (cumulative count 99920) 137s 99.960% <= 2.007 milliseconds (cumulative count 99960) 137s 100.000% <= 2.103 milliseconds (cumulative count 100000) 137s 137s Summary: 137s throughput summary: 473933.66 requests per second 137s latency summary (msec): 137s avg min p50 p95 p99 max 137s 0.953 0.336 0.919 1.375 1.631 2.103 137s 138s autopkgtest [21:04:05]: test 0002-benchmark: -----------------------] 138s 0002-benchmark PASS 138s autopkgtest [21:04:05]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 139s autopkgtest [21:04:06]: test 0003-valkey-check-aof: preparing testbed 139s Reading package lists... 139s Building dependency tree... 139s Reading state information... 139s Solving dependencies... 139s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 140s autopkgtest [21:04:07]: test 0003-valkey-check-aof: [----------------------- 141s autopkgtest [21:04:08]: test 0003-valkey-check-aof: -----------------------] 141s autopkgtest [21:04:08]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - - 141s 0003-valkey-check-aof PASS 142s autopkgtest [21:04:09]: test 0004-valkey-check-rdb: preparing testbed 142s Reading package lists... 142s Building dependency tree... 142s Reading state information... 142s Solving dependencies... 142s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
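Editor's note: each benchmark block above ends with a fixed-layout "latency summary (msec)" footer (avg/min/p50/p95/p99/max). A minimal sketch of extracting those six fields from a captured log chunk — the function name and returned keys are my own choices, not part of the valkey-benchmark tooling:

```python
import re

def parse_latency_summary(text):
    """Pull the six numbers from valkey-benchmark's
    'latency summary (msec)' footer (avg min p50 p95 p99 max)."""
    m = re.search(
        r"latency summary \(msec\):\s*"
        r"avg\s+min\s+p50\s+p95\s+p99\s+max\s*"
        r"([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)",
        text,
    )
    if not m:
        return None
    keys = ("avg", "min", "p50", "p95", "p99", "max")
    return dict(zip(keys, (float(v) for v in m.groups())))

# Sample footer copied from the FCALL block in this log:
sample = """Summary:
  throughput summary: 473933.66 requests per second
  latency summary (msec):
          avg       min       p50       p95       p99       max
        0.953     0.336     0.919     1.375     1.631     2.103"""
print(parse_latency_summary(sample))
```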
143s autopkgtest [21:04:10]: test 0004-valkey-check-rdb: [-----------------------
149s OK
149s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb
149s [offset 27] AUX FIELD valkey-ver = '8.1.3'
149s [offset 41] AUX FIELD redis-bits = '64'
149s [offset 53] AUX FIELD ctime = '1753650256'
149s [offset 68] AUX FIELD used-mem = '2995848'
149s [offset 80] AUX FIELD aof-base = '0'
149s [offset 191] Selecting DB ID 0
149s [offset 565585] Checksum OK
149s [offset 565585] \o/ RDB looks OK! \o/
149s [info] 5 keys read
149s [info] 0 expires
149s [info] 0 already expired
149s autopkgtest [21:04:16]: test 0004-valkey-check-rdb: -----------------------]
149s 0004-valkey-check-rdb PASS
149s autopkgtest [21:04:16]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - -
150s autopkgtest [21:04:17]: test 0005-cjson: preparing testbed
150s Reading package lists...
150s Building dependency tree...
150s Reading state information...
151s Solving dependencies...
151s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
151s autopkgtest [21:04:18]: test 0005-cjson: [-----------------------
157s
157s autopkgtest [21:04:24]: test 0005-cjson: -----------------------]
158s autopkgtest [21:04:25]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
158s 0005-cjson PASS
158s autopkgtest [21:04:25]: test 0006-migrate-from-redis: preparing testbed
177s Creating nova instance adt-questing-amd64-valkey-20250727-195506-juju-7f2275-prod-proposed-migration-environment-23-1ed870bd-c645-41fe-9c20-47649be53bf2 from image adt/ubuntu-questing-amd64-server-20250727.img (UUID 19a66749-1393-4666-8e85-1bb5b7c6ee26)...
219s autopkgtest [21:05:26]: testbed dpkg architecture: amd64
220s autopkgtest [21:05:27]: testbed apt version: 3.1.3
220s autopkgtest [21:05:27]: @@@@@@@@@@@@@@@@@@@@ test bed setup
220s autopkgtest [21:05:27]: testbed release detected to be: questing
221s autopkgtest [21:05:28]: updating testbed package index (apt update)
221s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
222s Hit:2 http://ftpmaster.internal/ubuntu questing InRelease
222s Hit:3 http://ftpmaster.internal/ubuntu questing-updates InRelease
222s Hit:4 http://ftpmaster.internal/ubuntu questing-security InRelease
222s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [13.0 kB]
222s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [149 kB]
222s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [30.7 kB]
222s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main amd64 Packages [52.3 kB]
222s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/main i386 Packages [42.1 kB]
222s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/universe i386 Packages [80.3 kB]
222s Get:11 http://ftpmaster.internal/ubuntu questing-proposed/universe amd64 Packages [138 kB]
222s Get:12 http://ftpmaster.internal/ubuntu questing-proposed/multiverse i386 Packages [8000 B]
222s Get:13 http://ftpmaster.internal/ubuntu questing-proposed/multiverse amd64 Packages [4696 B]
222s Fetched 768 kB in 1s (883 kB/s)
223s Reading package lists...
224s autopkgtest [21:05:31]: upgrading testbed (apt dist-upgrade and autopurge)
224s Reading package lists...
224s Building dependency tree...
224s Reading state information...
224s Calculating upgrade...
224s The following packages will be upgraded:
224s   iputils-ping iputils-tracepath libpam-modules libpam-modules-bin
224s   libpam-runtime libpam0g libxml2-16 rsync usb.ids
225s 9 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
225s Need to get 1793 kB of archives.
225s After this operation, 140 kB of additional disk space will be used.
225s Get:1 http://ftpmaster.internal/ubuntu questing/main amd64 libpam0g amd64 1.7.0-5ubuntu1 [69.5 kB]
225s Get:2 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-modules-bin amd64 1.7.0-5ubuntu1 [45.6 kB]
225s Get:3 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-modules amd64 1.7.0-5ubuntu1 [192 kB]
225s Get:4 http://ftpmaster.internal/ubuntu questing/main amd64 rsync amd64 3.4.1+ds1-5 [445 kB]
225s Get:5 http://ftpmaster.internal/ubuntu questing/main amd64 libpam-runtime all 1.7.0-5ubuntu1 [149 kB]
225s Get:6 http://ftpmaster.internal/ubuntu questing/main amd64 iputils-ping amd64 3:20240905-3ubuntu2 [46.4 kB]
225s Get:7 http://ftpmaster.internal/ubuntu questing/main amd64 libxml2-16 amd64 2.14.5+dfsg-0exp1 [607 kB]
225s Get:8 http://ftpmaster.internal/ubuntu questing/main amd64 iputils-tracepath amd64 3:20240905-3ubuntu2 [14.5 kB]
225s Get:9 http://ftpmaster.internal/ubuntu questing/main amd64 usb.ids all 2025.07.26-1 [224 kB]
225s Preconfiguring packages ...
226s Fetched 1793 kB in 1s (2577 kB/s)
226s (Reading database ... 81855 files and directories currently installed.)
226s Preparing to unpack .../libpam0g_1.7.0-5ubuntu1_amd64.deb ...
226s Unpacking libpam0g:amd64 (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
226s Setting up libpam0g:amd64 (1.7.0-5ubuntu1) ...
226s (Reading database ... 81855 files and directories currently installed.)
226s Preparing to unpack .../libpam-modules-bin_1.7.0-5ubuntu1_amd64.deb ...
226s Unpacking libpam-modules-bin (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
226s Setting up libpam-modules-bin (1.7.0-5ubuntu1) ...
226s pam_namespace.service is a disabled or a static unit not running, not starting it.
226s (Reading database ... 81847 files and directories currently installed.)
226s Preparing to unpack .../libpam-modules_1.7.0-5ubuntu1_amd64.deb ...
227s Unpacking libpam-modules:amd64 (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
227s Setting up libpam-modules:amd64 (1.7.0-5ubuntu1) ...
227s Installing new version of config file /etc/security/access.conf ...
227s Installing new version of config file /etc/security/pwhistory.conf ...
227s (Reading database ... 81794 files and directories currently installed.)
227s Preparing to unpack .../rsync_3.4.1+ds1-5_amd64.deb ...
227s Unpacking rsync (3.4.1+ds1-5) over (3.4.1+ds1-4) ...
227s Preparing to unpack .../libpam-runtime_1.7.0-5ubuntu1_all.deb ...
227s Unpacking libpam-runtime (1.7.0-5ubuntu1) over (1.5.3-7ubuntu6) ...
227s Setting up libpam-runtime (1.7.0-5ubuntu1) ...
227s (Reading database ... 81857 files and directories currently installed.)
227s Preparing to unpack .../iputils-ping_3%3a20240905-3ubuntu2_amd64.deb ...
227s Unpacking iputils-ping (3:20240905-3ubuntu2) over (3:20240905-3ubuntu1) ...
227s Preparing to unpack .../libxml2-16_2.14.5+dfsg-0exp1_amd64.deb ...
227s Unpacking libxml2-16:amd64 (2.14.5+dfsg-0exp1) over (2.14.4+dfsg-0exp1) ...
227s Preparing to unpack .../iputils-tracepath_3%3a20240905-3ubuntu2_amd64.deb ...
227s Unpacking iputils-tracepath (3:20240905-3ubuntu2) over (3:20240905-3ubuntu1) ...
227s Preparing to unpack .../usb.ids_2025.07.26-1_all.deb ...
227s Unpacking usb.ids (2025.07.26-1) over (2025.04.01-1) ...
227s Setting up libxml2-16:amd64 (2.14.5+dfsg-0exp1) ...
227s Setting up usb.ids (2025.07.26-1) ...
227s Setting up iputils-ping (3:20240905-3ubuntu2) ...
227s Setting up iputils-tracepath (3:20240905-3ubuntu2) ...
227s Setting up rsync (3.4.1+ds1-5) ...
228s rsync.service is a disabled or a static unit not running, not starting it.
228s Processing triggers for man-db (2.13.1-1) ...
229s Processing triggers for libc-bin (2.41-6ubuntu2) ...
230s Reading package lists...
230s Building dependency tree...
230s Reading state information...
230s Solving dependencies...
230s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
231s autopkgtest [21:05:38]: rebooting testbed after setup commands that affected boot
254s Reading package lists...
254s Building dependency tree...
254s Reading state information...
254s Solving dependencies...
254s The following NEW packages will be installed:
254s   liblzf1 redis-sentinel redis-server redis-tools
254s 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded.
254s Need to get 1425 kB of archives.
254s After this operation, 7547 kB of additional disk space will be used.
254s Get:1 http://ftpmaster.internal/ubuntu questing/universe amd64 liblzf1 amd64 3.6-4 [7624 B]
254s Get:2 http://ftpmaster.internal/ubuntu questing-proposed/universe amd64 redis-tools amd64 5:8.0.2-3 [1352 kB]
255s Get:3 http://ftpmaster.internal/ubuntu questing-proposed/universe amd64 redis-sentinel amd64 5:8.0.2-3 [12.6 kB]
255s Get:4 http://ftpmaster.internal/ubuntu questing-proposed/universe amd64 redis-server amd64 5:8.0.2-3 [53.2 kB]
255s Fetched 1425 kB in 1s (1786 kB/s)
255s Selecting previously unselected package liblzf1:amd64.
255s (Reading database ... 81857 files and directories currently installed.)
255s Preparing to unpack .../liblzf1_3.6-4_amd64.deb ...
255s Unpacking liblzf1:amd64 (3.6-4) ...
255s Selecting previously unselected package redis-tools.
255s Preparing to unpack .../redis-tools_5%3a8.0.2-3_amd64.deb ...
255s Unpacking redis-tools (5:8.0.2-3) ...
255s Selecting previously unselected package redis-sentinel.
255s Preparing to unpack .../redis-sentinel_5%3a8.0.2-3_amd64.deb ...
256s Unpacking redis-sentinel (5:8.0.2-3) ...
256s Selecting previously unselected package redis-server.
256s Preparing to unpack .../redis-server_5%3a8.0.2-3_amd64.deb ...
256s Unpacking redis-server (5:8.0.2-3) ...
256s Setting up liblzf1:amd64 (3.6-4) ...
256s Setting up redis-tools (5:8.0.2-3) ...
256s Setting up redis-server (5:8.0.2-3) ...
256s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'.
256s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'.
257s Setting up redis-sentinel (5:8.0.2-3) ...
257s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
257s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
257s Processing triggers for man-db (2.13.1-1) ...
258s Processing triggers for libc-bin (2.41-6ubuntu2) ...
261s autopkgtest [21:06:08]: test 0006-migrate-from-redis: [-----------------------
261s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION
261s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf
261s + systemctl restart redis-server
261s + redis-cli -h 127.0.0.1 -p 6379 SET test 1
261s OK
261s + redis-cli -h 127.0.0.1 -p 6379 GET test
261s 1
261s + redis-cli -h 127.0.0.1 -p 6379 SAVE
261s OK
261s + sha256sum /var/lib/redis/dump.rdb
261s 4a300cfc50d51a6453de07446175351cfd6632dcf173e0beee25ffe10801111a  /var/lib/redis/dump.rdb
261s + apt-get install -y valkey-redis-compat
261s Reading package lists...
261s Building dependency tree...
261s Reading state information...
262s Solving dependencies...
262s The following additional packages will be installed:
262s   valkey-server valkey-tools
262s Suggested packages:
262s   ruby-redis
262s The following packages will be REMOVED:
262s   redis-sentinel redis-server redis-tools
262s The following NEW packages will be installed:
262s   valkey-redis-compat valkey-server valkey-tools
262s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded.
262s Need to get 1359 kB of archives.
262s After this operation, 180 kB disk space will be freed.
262s Get:1 http://ftpmaster.internal/ubuntu questing/universe amd64 valkey-tools amd64 8.1.3+dfsg1-0ubuntu1 [1300 kB]
263s Get:2 http://ftpmaster.internal/ubuntu questing/universe amd64 valkey-server amd64 8.1.3+dfsg1-0ubuntu1 [51.7 kB]
263s Get:3 http://ftpmaster.internal/ubuntu questing/universe amd64 valkey-redis-compat all 8.1.3+dfsg1-0ubuntu1 [7796 B]
263s Fetched 1359 kB in 1s (1713 kB/s)
263s (Reading database ... 81906 files and directories currently installed.)
263s Removing redis-sentinel (5:8.0.2-3) ...
263s Removing redis-server (5:8.0.2-3) ...
264s Removing redis-tools (5:8.0.2-3) ...
264s Selecting previously unselected package valkey-tools.
264s (Reading database ... 81871 files and directories currently installed.)
264s Preparing to unpack .../valkey-tools_8.1.3+dfsg1-0ubuntu1_amd64.deb ...
264s Unpacking valkey-tools (8.1.3+dfsg1-0ubuntu1) ...
264s Selecting previously unselected package valkey-server.
264s Preparing to unpack .../valkey-server_8.1.3+dfsg1-0ubuntu1_amd64.deb ...
264s Unpacking valkey-server (8.1.3+dfsg1-0ubuntu1) ...
264s Selecting previously unselected package valkey-redis-compat.
264s Preparing to unpack .../valkey-redis-compat_8.1.3+dfsg1-0ubuntu1_all.deb ...
264s Unpacking valkey-redis-compat (8.1.3+dfsg1-0ubuntu1) ...
264s Setting up valkey-tools (8.1.3+dfsg1-0ubuntu1) ...
264s Setting up valkey-server (8.1.3+dfsg1-0ubuntu1) ...
265s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
265s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
265s Setting up valkey-redis-compat (8.1.3+dfsg1-0ubuntu1) ...
265s dpkg-query: no packages found matching valkey-sentinel
265s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf.
265s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf.
265s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey.
265s Processing triggers for man-db (2.13.1-1) ...
266s + '[' -f /etc/valkey/REDIS_MIGRATION ']'
266s + sha256sum /var/lib/valkey/dump.rdb
266s 152ce26805a2ec99666f23953a87dc338dbc2055fd58e68c6ea75a5c456cdc16  /var/lib/valkey/dump.rdb
266s + systemctl status valkey-server
266s + grep inactive
266s      Active: inactive (dead) since Sun 2025-07-27 21:06:12 UTC; 566ms ago
266s + rm /etc/valkey/REDIS_MIGRATION
266s + systemctl start valkey-server
266s Job for valkey-server.service failed because the control process exited with error code.
266s See "systemctl status valkey-server.service" and "journalctl -xeu valkey-server.service" for details.
266s autopkgtest [21:06:13]: test 0006-migrate-from-redis: -----------------------]
267s 0006-migrate-from-redis FAIL non-zero exit status 1
267s autopkgtest [21:06:14]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - -
267s autopkgtest [21:06:14]: @@@@@@@@@@@@@@@@@@@@ summary
267s 0001-valkey-cli PASS
267s 0002-benchmark PASS
267s 0003-valkey-check-aof PASS
267s 0004-valkey-check-rdb PASS
267s 0005-cjson PASS
267s 0006-migrate-from-redis FAIL non-zero exit status 1