0s autopkgtest [08:20:52]: starting date and time: 2025-06-30 08:20:52+0000
0s autopkgtest [08:20:52]: git checkout: 508d4a25 a-v-ssh wait_for_ssh: demote "ssh connection failed" to a debug message
0s autopkgtest [08:20:52]: host juju-7f2275-prod-proposed-migration-environment-9; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.b615d2ro/out --timeout-copy=6000 --setup-commands 'ln -s /dev/null /etc/systemd/system/bluetooth.service; printf "http_proxy=http://squid.internal:3128\nhttps_proxy=http://squid.internal:3128\nno_proxy=127.0.0.1,127.0.1.1,localhost,localdomain,internal,login.ubuntu.com,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com\n" >> /etc/environment' --apt-pocket=proposed=src:redis --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=redis/5:8.0.0-2 -- lxd -r lxd-armhf-10.145.243.254 lxd-armhf-10.145.243.254:autopkgtest/ubuntu/questing/armhf
21s autopkgtest [08:21:13]: testbed dpkg architecture: armhf
22s autopkgtest [08:21:14]: testbed apt version: 3.1.2
26s autopkgtest [08:21:18]: @@@@@@@@@@@@@@@@@@@@ test bed setup
28s autopkgtest [08:21:20]: testbed release detected to be: None
35s autopkgtest [08:21:27]: updating testbed package index (apt update)
37s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
38s Get:2 http://ftpmaster.internal/ubuntu questing InRelease [249 kB]
38s Get:3 http://ftpmaster.internal/ubuntu questing-updates InRelease [110 kB]
38s Get:4 http://ftpmaster.internal/ubuntu questing-security InRelease [110 kB]
38s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [26.6 kB]
38s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [17.5 kB]
38s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [429 kB]
38s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main armhf Packages [32.9 kB]
38s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/universe armhf Packages [359 kB]
38s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/multiverse armhf Packages [3452 B]
38s Get:11 http://ftpmaster.internal/ubuntu questing/universe Sources [21.3 MB]
39s Get:12 http://ftpmaster.internal/ubuntu questing/universe armhf Packages [15.3 MB]
43s Fetched 38.2 MB in 6s (6939 kB/s)
44s Reading package lists...
50s autopkgtest [08:21:42]: upgrading testbed (apt dist-upgrade and autopurge)
51s Reading package lists...
52s Building dependency tree...
52s Reading state information...
52s Calculating upgrade...
53s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
54s Reading package lists...
55s Building dependency tree...
55s Reading state information...
55s Solving dependencies...
56s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
58s autopkgtest [08:21:50]: rebooting testbed after setup commands that affected boot
98s autopkgtest [08:22:30]: testbed running kernel: Linux 6.8.0-58-generic #60~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Fri Mar 28 14:48:37 UTC 2
123s autopkgtest [08:22:55]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
145s Get:1 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.1+dfsg1-2ubuntu1 (dsc) [2484 B]
145s Get:2 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.1+dfsg1-2ubuntu1 (tar) [2726 kB]
145s Get:3 http://ftpmaster.internal/ubuntu questing/universe valkey 8.1.1+dfsg1-2ubuntu1 (diff) [20.4 kB]
145s gpgv: Signature made Wed Jun 18 14:39:32 2025 UTC
145s gpgv: using RSA key 63EEFC3DE14D5146CE7F24BF34B8AD7D9529E793
145s gpgv: issuer "lena.voytek@canonical.com"
145s gpgv: Can't check signature: No public key
145s dpkg-source: warning: cannot verify inline signature for ./valkey_8.1.1+dfsg1-2ubuntu1.dsc: no acceptable signature found
145s autopkgtest [08:23:17]: testing package valkey version 8.1.1+dfsg1-2ubuntu1
149s autopkgtest [08:23:21]: build not needed
154s autopkgtest [08:23:26]: test 0001-valkey-cli: preparing testbed
156s Reading package lists...
156s Building dependency tree...
156s Reading state information...
156s Solving dependencies...
157s The following NEW packages will be installed:
157s   liblzf1 valkey-server valkey-tools
157s 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded.
157s Need to get 1256 kB of archives.
157s After this operation, 5097 kB of additional disk space will be used.
157s Get:1 http://ftpmaster.internal/ubuntu questing/universe armhf liblzf1 armhf 3.6-4 [6554 B]
157s Get:2 http://ftpmaster.internal/ubuntu questing/universe armhf valkey-tools armhf 8.1.1+dfsg1-2ubuntu1 [1198 kB]
158s Get:3 http://ftpmaster.internal/ubuntu questing/universe armhf valkey-server armhf 8.1.1+dfsg1-2ubuntu1 [51.7 kB]
159s Fetched 1256 kB in 1s (924 kB/s)
159s Selecting previously unselected package liblzf1:armhf.
159s (Reading database ... 59851 files and directories currently installed.)
159s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ...
159s Unpacking liblzf1:armhf (3.6-4) ...
159s Selecting previously unselected package valkey-tools.
159s Preparing to unpack .../valkey-tools_8.1.1+dfsg1-2ubuntu1_armhf.deb ...
159s Unpacking valkey-tools (8.1.1+dfsg1-2ubuntu1) ...
159s Selecting previously unselected package valkey-server.
159s Preparing to unpack .../valkey-server_8.1.1+dfsg1-2ubuntu1_armhf.deb ...
159s Unpacking valkey-server (8.1.1+dfsg1-2ubuntu1) ...
159s Setting up liblzf1:armhf (3.6-4) ...
159s Setting up valkey-tools (8.1.1+dfsg1-2ubuntu1) ...
159s Setting up valkey-server (8.1.1+dfsg1-2ubuntu1) ...
160s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
160s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
160s Processing triggers for man-db (2.13.1-1) ...
161s Processing triggers for libc-bin (2.41-6ubuntu2) ...
169s autopkgtest [08:23:41]: test 0001-valkey-cli: [-----------------------
176s # Server
176s redis_version:7.2.4
176s server_name:valkey
176s valkey_version:8.1.1
176s valkey_release_stage:ga
176s redis_git_sha1:00000000
176s redis_git_dirty:0
176s redis_build_id:454dc2cf719509d2
176s server_mode:standalone
176s os:Linux 6.8.0-58-generic armv7l
176s arch_bits:32
176s monotonic_clock:POSIX clock_gettime
176s multiplexing_api:epoll
176s gcc_version:14.3.0
176s process_id:1080
176s process_supervised:systemd
176s run_id:87923fc20097a3cdf53d5f5f696308ae247dcda1
176s tcp_port:6379
176s server_time_usec:1751271828473870
176s uptime_in_seconds:5
176s uptime_in_days:0
176s hz:10
176s configured_hz:10
176s clients_hz:10
176s lru_clock:6441364
176s executable:/usr/bin/valkey-server
176s config_file:/etc/valkey/valkey.conf
176s io_threads_active:0
176s availability_zone:
176s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
176s
176s # Clients
176s connected_clients:1
176s cluster_connections:0
176s maxclients:10000
176s client_recent_max_input_buffer:0
176s client_recent_max_output_buffer:0
176s blocked_clients:0
176s tracking_clients:0
176s pubsub_clients:0
176s watching_clients:0
176s clients_in_timeout_table:0
176s total_watched_keys:0
176s total_blocking_keys:0
176s total_blocking_keys_on_nokey:0
176s paused_reason:none
176s paused_actions:none
176s paused_timeout_milliseconds:0
176s
176s # Memory
176s used_memory:737112
176s used_memory_human:719.84K
176s used_memory_rss:10223616
176s used_memory_rss_human:9.75M
176s used_memory_peak:737112
176s used_memory_peak_human:719.84K
176s used_memory_peak_perc:100.35%
176s used_memory_overhead:717712
176s used_memory_startup:717576
176s used_memory_dataset:19400
176s used_memory_dataset_perc:99.30%
176s allocator_allocated:3967936
176s allocator_active:9502720
176s allocator_resident:10289152
176s allocator_muzzy:0
176s total_system_memory:3844059136
176s total_system_memory_human:3.58G
176s used_memory_lua:23552
176s used_memory_vm_eval:23552
176s used_memory_lua_human:23.00K
176s used_memory_scripts_eval:0
176s number_of_cached_scripts:0
176s number_of_functions:0
176s number_of_libraries:0
176s used_memory_vm_functions:24576
176s used_memory_vm_total:48128
176s used_memory_vm_total_human:47.00K
176s used_memory_functions:136
176s used_memory_scripts:136
176s used_memory_scripts_human:136B
176s maxmemory:3221225472
176s maxmemory_human:3.00G
176s maxmemory_policy:noeviction
176s allocator_frag_ratio:1.00
176s allocator_frag_bytes:0
176s allocator_rss_ratio:1.08
176s allocator_rss_bytes:786432
176s rss_overhead_ratio:0.99
176s rss_overhead_bytes:-65536
176s mem_fragmentation_ratio:14.25
176s mem_fragmentation_bytes:9505952
176s mem_not_counted_for_evict:0
176s mem_replication_backlog:0
176s mem_total_replication_buffers:0
176s mem_clients_slaves:0
176s mem_clients_normal:0
176s mem_cluster_links:0
176s mem_aof_buffer:0
176s mem_allocator:jemalloc-5.3.0
176s mem_overhead_db_hashtable_rehashing:0
176s active_defrag_running:0
176s lazyfree_pending_objects:0
176s lazyfreed_objects:0
176s
176s # Persistence
176s loading:0
176s async_loading:0
176s current_cow_peak:0
176s current_cow_size:0
176s current_cow_size_age:0
176s current_fork_perc:0.00
176s current_save_keys_processed:0
176s current_save_keys_total:0
176s rdb_changes_since_last_save:0
176s rdb_bgsave_in_progress:0
176s rdb_last_save_time:1751271823
176s rdb_last_bgsave_status:ok
176s rdb_last_bgsave_time_sec:-1
176s rdb_current_bgsave_time_sec:-1
176s rdb_saves:0
176s rdb_last_cow_size:0
176s rdb_last_load_keys_expired:0
176s rdb_last_load_keys_loaded:0
176s aof_enabled:0
176s aof_rewrite_in_progress:0
176s aof_rewrite_scheduled:0
176s aof_last_rewrite_time_sec:-1
176s aof_current_rewrite_time_sec:-1
176s aof_last_bgrewrite_status:ok
176s aof_rewrites:0
176s aof_rewrites_consecutive_failures:0
176s aof_last_write_status:ok
176s aof_last_cow_size:0
176s module_fork_in_progress:0
176s module_fork_last_cow_size:0
176s
176s # Stats
176s total_connections_received:1
176s total_commands_processed:0
176s instantaneous_ops_per_sec:0
176s total_net_input_bytes:14
176s total_net_output_bytes:0
176s total_net_repl_input_bytes:0
176s total_net_repl_output_bytes:0
176s instantaneous_input_kbps:0.00
176s instantaneous_output_kbps:0.00
176s instantaneous_input_repl_kbps:0.00
176s instantaneous_output_repl_kbps:0.00
176s rejected_connections:0
176s sync_full:0
176s sync_partial_ok:0
176s sync_partial_err:0
176s expired_keys:0
176s expired_stale_perc:0.00
176s expired_time_cap_reached_count:0
176s expire_cycle_cpu_milliseconds:0
176s evicted_keys:0
176s evicted_clients:0
176s evicted_scripts:0
176s total_eviction_exceeded_time:0
176s current_eviction_exceeded_time:0
176s keyspace_hits:0
176s keyspace_misses:0
176s pubsub_channels:0
176s pubsub_patterns:0
176s pubsubshard_channels:0
176s latest_fork_usec:0
176s total_forks:0
176s migrate_cached_sockets:0
176s slave_expires_tracked_keys:0
176s active_defrag_hits:0
176s active_defrag_misses:0
176s active_defrag_key_hits:0
176s active_defrag_key_misses:0
176s total_active_defrag_time:0
176s current_active_defrag_time:0
176s tracking_total_keys:0
176s tracking_total_items:0
176s tracking_total_prefixes:0
176s unexpected_error_replies:0
176s total_error_replies:0
176s dump_payload_sanitizations:0
176s total_reads_processed:1
176s total_writes_processed:0
176s io_threaded_reads_processed:0
176s io_threaded_writes_processed:0
176s io_threaded_freed_objects:0
176s io_threaded_accept_processed:0
176s io_threaded_poll_processed:0
176s io_threaded_total_prefetch_batches:0
176s io_threaded_total_prefetch_entries:0
176s client_query_buffer_limit_disconnections:0
176s client_output_buffer_limit_disconnections:0
176s reply_buffer_shrinks:0
176s reply_buffer_expands:0
176s eventloop_cycles:51
176s eventloop_duration_sum:11419
176s eventloop_duration_cmd_sum:0
176s instantaneous_eventloop_cycles_per_sec:9
176s instantaneous_eventloop_duration_usec:214
176s acl_access_denied_auth:0
176s acl_access_denied_cmd:0
176s acl_access_denied_key:0
176s acl_access_denied_channel:0
176s
176s # Replication
176s role:master
176s connected_slaves:0
176s replicas_waiting_psync:0
176s master_failover_state:no-failover
176s master_replid:d25f33da973ba609be809bb7e8f9ce7eab442a7d
176s master_replid2:0000000000000000000000000000000000000000
176s master_repl_offset:0
176s second_repl_offset:-1
176s repl_backlog_active:0
176s repl_backlog_size:10485760
176s repl_backlog_first_byte_offset:0
176s repl_backlog_histlen:0
176s
176s # CPU
176s used_cpu_sys:0.051742
176s used_cpu_user:0.066408
176s used_cpu_sys_children:0.001551
176s used_cpu_user_children:0.000000
176s used_cpu_sys_main_thread:0.050775
176s used_cpu_user_main_thread:0.066319
176s
176s # Modules
176s
176s # Errorstats
176s
176s # Cluster
176s cluster_enabled:0
176s
176s # Keyspace
176s Redis ver. 8.1.1
176s autopkgtest [08:23:48]: test 0001-valkey-cli: -----------------------]
180s 0001-valkey-cli PASS
180s autopkgtest [08:23:52]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
184s autopkgtest [08:23:56]: test 0002-benchmark: preparing testbed
186s Reading package lists...
186s Building dependency tree...
186s Reading state information...
186s Solving dependencies...
187s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
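The 0001-valkey-cli test above dumps the server's INFO output, which is a line-oriented format of "# Section" headers and "key:value" pairs. As a minimal illustrative sketch (not part of the test suite), such output can be parsed into a dict to check fields like valkey_version; the function name and sample below are this sketch's own, not taken from the test:

```python
# Sketch: parse valkey-cli INFO output ("# Section" headers, "key:value" pairs)
# into a dict of sections. Illustrative only; not the packaged test's code.
def parse_info(raw: str) -> dict:
    sections: dict = {}
    current = None
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue  # blank separator lines between sections
        if line.startswith("#"):
            current = line.lstrip("# ").strip()  # new section header
            sections[current] = {}
        elif ":" in line and current is not None:
            key, _, value = line.partition(":")  # values stay as strings
            sections[current][key] = value
    return sections

sample = """# Server
valkey_version:8.1.1
server_name:valkey

# Clients
connected_clients:1"""

info = parse_info(sample)
print(info["Server"]["valkey_version"])  # -> 8.1.1
```

Against a live server, the same parser could be fed the output of `valkey-cli info` to assert on fields such as server_name or tcp_port.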
195s autopkgtest [08:24:07]: test 0002-benchmark: [-----------------------
202s PING_INLINE: rps=378280.0 (overall: 376772.9) avg_msec=1.171 (overall: 1.171)
202s ====== PING_INLINE ======
202s 100000 requests completed in 0.26 seconds
202s 50 parallel clients
202s 3 bytes payload
202s keep alive: 1
202s host configuration "save": 3600 1 300 100 60 10000
202s host configuration "appendonly": no
202s multi-thread: no
202s
202s Latency by percentile distribution:
202s 0.000% <= 0.367 milliseconds (cumulative count 10)
202s 50.000% <= 1.151 milliseconds (cumulative count 51140)
202s 75.000% <= 1.327 milliseconds (cumulative count 75360)
202s 87.500% <= 1.455 milliseconds (cumulative count 88120)
202s 93.750% <= 1.551 milliseconds (cumulative count 93920)
202s 96.875% <= 1.663 milliseconds (cumulative count 96910)
202s 98.438% <= 1.847 milliseconds (cumulative count 98450)
202s 99.219% <= 2.015 milliseconds (cumulative count 99220)
202s 99.609% <= 2.207 milliseconds (cumulative count 99630)
202s 99.805% <= 2.351 milliseconds (cumulative count 99820)
202s 99.902% <= 2.471 milliseconds (cumulative count 99910)
202s 99.951% <= 2.511 milliseconds (cumulative count 99960)
202s 99.976% <= 2.535 milliseconds (cumulative count 99980)
202s 99.988% <= 2.591 milliseconds (cumulative count 99990)
202s 99.994% <= 2.607 milliseconds (cumulative count 100000)
202s 100.000% <= 2.607 milliseconds (cumulative count 100000)
202s
202s Cumulative distribution of latencies:
202s 0.000% <= 0.103 milliseconds (cumulative count 0)
202s 0.080% <= 0.407 milliseconds (cumulative count 80)
202s 0.310% <= 0.503 milliseconds (cumulative count 310)
202s 0.980% <= 0.607 milliseconds (cumulative count 980)
202s 2.070% <= 0.703 milliseconds (cumulative count 2070)
202s 4.330% <= 0.807 milliseconds (cumulative count 4330)
202s 12.190% <= 0.903 milliseconds (cumulative count 12190)
202s 27.740% <= 1.007 milliseconds (cumulative count 27740)
202s 43.270% <= 1.103 milliseconds (cumulative count 43270)
202s 60.030% <= 1.207 milliseconds (cumulative count 60030)
202s 72.660% <= 1.303 milliseconds (cumulative count 72660)
202s 83.780% <= 1.407 milliseconds (cumulative count 83780)
202s 91.550% <= 1.503 milliseconds (cumulative count 91550)
202s 95.750% <= 1.607 milliseconds (cumulative count 95750)
202s 97.450% <= 1.703 milliseconds (cumulative count 97450)
202s 98.200% <= 1.807 milliseconds (cumulative count 98200)
202s 98.710% <= 1.903 milliseconds (cumulative count 98710)
202s 99.210% <= 2.007 milliseconds (cumulative count 99210)
202s 99.440% <= 2.103 milliseconds (cumulative count 99440)
202s 100.000% <= 3.103 milliseconds (cumulative count 100000)
202s
202s Summary:
202s throughput summary: 377358.50 requests per second
202s latency summary (msec):
202s avg min p50 p95 p99 max
202s 1.169 0.360 1.151 1.583 1.967 2.607
203s PING_MBULK: rps=356374.5 (overall: 382265.0) avg_msec=1.155 (overall: 1.155)
203s ====== PING_MBULK ======
203s 100000 requests completed in 0.26 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s
203s Latency by percentile distribution:
203s 0.000% <= 0.431 milliseconds (cumulative count 10)
203s 50.000% <= 1.127 milliseconds (cumulative count 50700)
203s 75.000% <= 1.319 milliseconds (cumulative count 75490)
203s 87.500% <= 1.439 milliseconds (cumulative count 87780)
203s 93.750% <= 1.527 milliseconds (cumulative count 94140)
203s 96.875% <= 1.599 milliseconds (cumulative count 96900)
203s 98.438% <= 1.679 milliseconds (cumulative count 98490)
203s 99.219% <= 1.767 milliseconds (cumulative count 99230)
203s 99.609% <= 1.839 milliseconds (cumulative count 99610)
203s 99.805% <= 1.911 milliseconds (cumulative count 99810)
203s 99.902% <= 2.015 milliseconds (cumulative count 99910)
203s 99.951% <= 2.135 milliseconds (cumulative count 99960)
203s 99.976% <= 2.167 milliseconds (cumulative count 99980)
203s 99.988% <= 2.191 milliseconds (cumulative count 99990)
203s 99.994% <= 2.287 milliseconds (cumulative count 100000)
203s 100.000% <= 2.287 milliseconds (cumulative count 100000)
203s
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.130% <= 0.503 milliseconds (cumulative count 130)
203s 0.430% <= 0.607 milliseconds (cumulative count 430)
203s 1.180% <= 0.703 milliseconds (cumulative count 1180)
203s 3.120% <= 0.807 milliseconds (cumulative count 3120)
203s 10.740% <= 0.903 milliseconds (cumulative count 10740)
203s 29.980% <= 1.007 milliseconds (cumulative count 29980)
203s 47.380% <= 1.103 milliseconds (cumulative count 47380)
203s 62.110% <= 1.207 milliseconds (cumulative count 62110)
203s 73.780% <= 1.303 milliseconds (cumulative count 73780)
203s 84.630% <= 1.407 milliseconds (cumulative count 84630)
203s 92.730% <= 1.503 milliseconds (cumulative count 92730)
203s 97.170% <= 1.607 milliseconds (cumulative count 97170)
203s 98.690% <= 1.703 milliseconds (cumulative count 98690)
203s 99.450% <= 1.807 milliseconds (cumulative count 99450)
203s 99.780% <= 1.903 milliseconds (cumulative count 99780)
203s 99.900% <= 2.007 milliseconds (cumulative count 99900)
203s 99.940% <= 2.103 milliseconds (cumulative count 99940)
203s 100.000% <= 3.103 milliseconds (cumulative count 100000)
203s
203s Summary:
203s throughput summary: 383141.75 requests per second
203s latency summary (msec):
203s avg min p50 p95 p99 max
203s 1.155 0.424 1.127 1.551 1.735 2.287
203s SET: rps=245360.0 (overall: 278818.2) avg_msec=1.580 (overall: 1.580)
203s ====== SET ======
203s 100000 requests completed in 0.36 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s
203s Latency by percentile distribution:
203s 0.000% <= 0.543 milliseconds (cumulative count 10)
203s 50.000% <= 1.567 milliseconds (cumulative count 50070)
203s 75.000% <= 1.815 milliseconds (cumulative count 75110)
203s 87.500% <= 1.991 milliseconds (cumulative count 87610)
203s 93.750% <= 2.103 milliseconds (cumulative count 93990)
203s 96.875% <= 2.207 milliseconds (cumulative count 97080)
203s 98.438% <= 2.279 milliseconds (cumulative count 98440)
203s 99.219% <= 2.367 milliseconds (cumulative count 99230)
203s 99.609% <= 2.447 milliseconds (cumulative count 99620)
203s 99.805% <= 2.567 milliseconds (cumulative count 99810)
203s 99.902% <= 2.695 milliseconds (cumulative count 99920)
203s 99.951% <= 2.815 milliseconds (cumulative count 99960)
203s 99.976% <= 2.927 milliseconds (cumulative count 99980)
203s 99.988% <= 2.983 milliseconds (cumulative count 99990)
203s 99.994% <= 3.039 milliseconds (cumulative count 100000)
203s 100.000% <= 3.039 milliseconds (cumulative count 100000)
203s
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.070% <= 0.607 milliseconds (cumulative count 70)
203s 0.190% <= 0.703 milliseconds (cumulative count 190)
203s 0.600% <= 0.807 milliseconds (cumulative count 600)
203s 1.160% <= 0.903 milliseconds (cumulative count 1160)
203s 1.960% <= 1.007 milliseconds (cumulative count 1960)
203s 3.340% <= 1.103 milliseconds (cumulative count 3340)
203s 7.730% <= 1.207 milliseconds (cumulative count 7730)
203s 17.710% <= 1.303 milliseconds (cumulative count 17710)
203s 31.490% <= 1.407 milliseconds (cumulative count 31490)
203s 42.930% <= 1.503 milliseconds (cumulative count 42930)
203s 54.570% <= 1.607 milliseconds (cumulative count 54570)
203s 64.870% <= 1.703 milliseconds (cumulative count 64870)
203s 74.420% <= 1.807 milliseconds (cumulative count 74420)
203s 81.640% <= 1.903 milliseconds (cumulative count 81640)
203s 88.460% <= 2.007 milliseconds (cumulative count 88460)
203s 93.990% <= 2.103 milliseconds (cumulative count 93990)
203s 100.000% <= 3.103 milliseconds (cumulative count 100000)
203s
203s Summary:
203s throughput summary: 277008.31 requests per second
203s latency summary (msec):
203s avg min p50 p95 p99 max
203s 1.596 0.536 1.567 2.135 2.335 3.039
203s GET: rps=132031.9 (overall: 309719.6) avg_msec=1.390 (overall: 1.390)
203s ====== GET ======
203s 100000 requests completed in 0.32 seconds
203s 50 parallel clients
203s 3 bytes payload
203s keep alive: 1
203s host configuration "save": 3600 1 300 100 60 10000
203s host configuration "appendonly": no
203s multi-thread: no
203s
203s Latency by percentile distribution:
203s 0.000% <= 0.559 milliseconds (cumulative count 10)
203s 50.000% <= 1.351 milliseconds (cumulative count 50720)
203s 75.000% <= 1.591 milliseconds (cumulative count 75380)
203s 87.500% <= 1.751 milliseconds (cumulative count 87580)
203s 93.750% <= 1.855 milliseconds (cumulative count 94140)
203s 96.875% <= 1.927 milliseconds (cumulative count 96920)
203s 98.438% <= 2.007 milliseconds (cumulative count 98520)
203s 99.219% <= 2.087 milliseconds (cumulative count 99280)
203s 99.609% <= 2.159 milliseconds (cumulative count 99610)
203s 99.805% <= 2.223 milliseconds (cumulative count 99830)
203s 99.902% <= 2.311 milliseconds (cumulative count 99920)
203s 99.951% <= 2.359 milliseconds (cumulative count 99970)
203s 99.976% <= 2.367 milliseconds (cumulative count 99980)
203s 99.988% <= 2.383 milliseconds (cumulative count 99990)
203s 99.994% <= 2.535 milliseconds (cumulative count 100000)
203s 100.000% <= 2.535 milliseconds (cumulative count 100000)
203s
203s Cumulative distribution of latencies:
203s 0.000% <= 0.103 milliseconds (cumulative count 0)
203s 0.020% <= 0.607 milliseconds (cumulative count 20)
203s 0.240% <= 0.703 milliseconds (cumulative count 240)
203s 0.820% <= 0.807 milliseconds (cumulative count 820)
203s 1.730% <= 0.903 milliseconds (cumulative count 1730)
203s 4.640% <= 1.007 milliseconds (cumulative count 4640)
203s 12.390% <= 1.103 milliseconds (cumulative count 12390)
203s 28.540% <= 1.207 milliseconds (cumulative count 28540)
203s 44.420% <= 1.303 milliseconds (cumulative count 44420)
203s 57.440% <= 1.407 milliseconds (cumulative count 57440)
203s 67.630% <= 1.503 milliseconds (cumulative count 67630)
203s 76.770% <= 1.607 milliseconds (cumulative count 76770)
203s 83.990% <= 1.703 milliseconds (cumulative count 83990)
203s 91.430% <= 1.807 milliseconds (cumulative count 91430)
203s 96.200% <= 1.903 milliseconds (cumulative count 96200)
203s 98.520% <= 2.007 milliseconds (cumulative count 98520)
203s 99.350% <= 2.103 milliseconds (cumulative count 99350)
203s 100.000% <= 3.103 milliseconds (cumulative count 100000)
203s
203s Summary:
203s throughput summary: 313479.62 requests per second
203s latency summary (msec):
203s avg min p50 p95 p99 max
203s 1.392 0.552 1.351 1.879 2.055 2.535
204s INCR: rps=307290.8 (overall: 305454.5) avg_msec=1.431 (overall: 1.431)
204s ====== INCR ======
204s 100000 requests completed in 0.33 seconds
204s 50 parallel clients
204s 3 bytes payload
204s keep alive: 1
204s host configuration "save": 3600 1 300 100 60 10000
204s host configuration "appendonly": no
204s multi-thread: no
204s
204s Latency by percentile distribution:
204s 0.000% <= 0.511 milliseconds (cumulative count 10)
204s 50.000% <= 1.391 milliseconds (cumulative count 50210)
204s 75.000% <= 1.639 milliseconds (cumulative count 75280)
204s 87.500% <= 1.807 milliseconds (cumulative count 88080)
204s 93.750% <= 1.903 milliseconds (cumulative count 93810)
204s 96.875% <= 1.991 milliseconds (cumulative count 97130)
204s 98.438% <= 2.071 milliseconds (cumulative count 98450)
204s 99.219% <= 2.159 milliseconds (cumulative count 99250)
204s 99.609% <= 2.239 milliseconds (cumulative count 99620)
204s 99.805% <= 2.327 milliseconds (cumulative count 99810)
204s 99.902% <= 2.423 milliseconds (cumulative count 99920)
204s 99.951% <= 2.487 milliseconds (cumulative count 99970)
204s 99.976% <= 2.495 milliseconds (cumulative count 99980)
204s 99.988% <= 2.543 milliseconds (cumulative count 99990)
204s 99.994% <= 2.647 milliseconds (cumulative count 100000)
204s 100.000% <= 2.647 milliseconds (cumulative count 100000)
204s
204s Cumulative distribution of latencies:
204s 0.000% <= 0.103 milliseconds (cumulative count 0)
204s 0.100% <= 0.607 milliseconds (cumulative count 100)
204s 0.570% <= 0.703 milliseconds (cumulative count 570)
204s 1.380% <= 0.807 milliseconds (cumulative count 1380)
204s 2.350% <= 0.903 milliseconds (cumulative count 2350)
204s 4.820% <= 1.007 milliseconds (cumulative count 4820)
204s 10.360% <= 1.103 milliseconds (cumulative count 10360)
204s 23.370% <= 1.207 milliseconds (cumulative count 23370)
204s 38.190% <= 1.303 milliseconds (cumulative count 38190)
204s 52.100% <= 1.407 milliseconds (cumulative count 52100)
204s 62.850% <= 1.503 milliseconds (cumulative count 62850)
204s 72.660% <= 1.607 milliseconds (cumulative count 72660)
204s 80.470% <= 1.703 milliseconds (cumulative count 80470)
204s 88.080% <= 1.807 milliseconds (cumulative count 88080)
204s 93.810% <= 1.903 milliseconds (cumulative count 93810)
204s 97.380% <= 2.007 milliseconds (cumulative count 97380)
204s 98.830% <= 2.103 milliseconds (cumulative count 98830)
204s 100.000% <= 3.103 milliseconds (cumulative count 100000)
204s
204s Summary:
204s throughput summary: 305810.41 requests per second
204s latency summary (msec):
204s avg min p50 p95 p99 max
204s 1.430 0.504 1.391 1.935 2.127 2.647
204s LPUSH: rps=211440.0 (overall: 256601.9) avg_msec=1.730 (overall: 1.730)
204s ====== LPUSH ======
204s 100000 requests completed in 0.39 seconds
204s 50 parallel clients
204s 3 bytes payload
204s keep alive: 1
204s host configuration "save": 3600 1 300 100 60 10000
204s host configuration "appendonly": no
204s multi-thread: no
204s
204s Latency by percentile distribution:
204s 0.000% <= 0.687 milliseconds (cumulative count 10)
204s 50.000% <= 1.743 milliseconds (cumulative count 50830)
204s 75.000% <= 1.991 milliseconds (cumulative count 75210)
204s 87.500% <= 2.159 milliseconds (cumulative count 87570)
204s 93.750% <= 2.271 milliseconds (cumulative count 93830)
204s 96.875% <= 2.359 milliseconds (cumulative count 96910)
204s 98.438% <= 2.431 milliseconds (cumulative count 98530)
204s 99.219% <= 2.495 milliseconds (cumulative count 99260)
204s 99.609% <= 2.551 milliseconds (cumulative count 99620)
204s 99.805% <= 2.607 milliseconds (cumulative count 99820)
204s 99.902% <= 2.647 milliseconds (cumulative count 99920)
204s 99.951% <= 2.703 milliseconds (cumulative count 99960)
204s 99.976% <= 2.751 milliseconds (cumulative count 99980)
204s 99.988% <= 2.775 milliseconds (cumulative count 99990)
204s 99.994% <= 2.799 milliseconds (cumulative count 100000)
204s 100.000% <= 2.799 milliseconds (cumulative count 100000)
204s
204s Cumulative distribution of latencies:
204s 0.000% <= 0.103 milliseconds (cumulative count 0)
204s 0.010% <= 0.703 milliseconds (cumulative count 10)
204s 0.070% <= 0.807 milliseconds (cumulative count 70)
204s 0.290% <= 0.903 milliseconds (cumulative count 290)
204s 0.740% <= 1.007 milliseconds (cumulative count 740)
204s 1.680% <= 1.103 milliseconds (cumulative count 1680)
204s 4.020% <= 1.207 milliseconds (cumulative count 4020)
204s 8.310% <= 1.303 milliseconds (cumulative count 8310)
204s 16.430% <= 1.407 milliseconds (cumulative count 16430)
204s 26.310% <= 1.503 milliseconds (cumulative count 26310)
204s 36.800% <= 1.607 milliseconds (cumulative count 36800)
204s 46.530% <= 1.703 milliseconds (cumulative count 46530)
204s 57.480% <= 1.807 milliseconds (cumulative count 57480)
204s 67.130% <= 1.903 milliseconds (cumulative count 67130)
204s 76.470% <= 2.007 milliseconds (cumulative count 76470)
204s 83.810% <= 2.103 milliseconds (cumulative count 83810)
204s 100.000% <= 3.103 milliseconds (cumulative count 100000)
204s
204s Summary:
204s throughput summary: 255102.05 requests per second
204s latency summary (msec):
204s avg min p50 p95 p99 max
204s 1.747 0.680 1.743 2.303 2.471 2.799
204s RPUSH: rps=289600.0 (overall: 288493.6) avg_msec=1.537 (overall: 1.535)
204s ====== RPUSH ======
204s 100000 requests completed in 0.35 seconds
204s 50 parallel clients
204s 3 bytes payload
204s keep alive: 1
204s host configuration "save": 3600 1 300 100 60 10000
204s host configuration "appendonly": no
204s multi-thread: no
204s
204s Latency by percentile distribution:
204s 0.000% <= 0.535 milliseconds (cumulative count 20)
204s 50.000% <= 1.495 milliseconds (cumulative count 50450)
204s 75.000% <= 1.751 milliseconds (cumulative count 75510)
204s 87.500% <= 1.919 milliseconds (cumulative count 88010)
204s 93.750% <= 2.015 milliseconds (cumulative count 94070)
204s 96.875% <= 2.087 milliseconds (cumulative count 96940)
204s 98.438% <= 2.167 milliseconds (cumulative count 98530)
204s 99.219% <= 2.231 milliseconds (cumulative count 99290)
204s 99.609% <= 2.287 milliseconds (cumulative count 99620)
204s 99.805% <= 2.359 milliseconds (cumulative count 99820)
204s 99.902% <= 2.431 milliseconds (cumulative count 99920)
204s 99.951% <= 2.487 milliseconds (cumulative count 99960)
204s 99.976% <= 2.503 milliseconds (cumulative count 99980)
204s 99.988% <= 2.559 milliseconds (cumulative count 99990)
204s 99.994% <= 2.639 milliseconds (cumulative count 100000)
204s 100.000% <= 2.639 milliseconds (cumulative count 100000)
204s
204s Cumulative distribution of latencies:
204s 0.000% <= 0.103 milliseconds (cumulative count 0)
204s 0.090% <= 0.607 milliseconds (cumulative count 90)
204s 0.300% <= 0.703 milliseconds (cumulative count 300)
204s 0.760% <= 0.807 milliseconds (cumulative count 760)
204s 1.170% <= 0.903 milliseconds (cumulative count 1170)
204s 1.880% <= 1.007 milliseconds (cumulative count 1880)
204s 3.680% <= 1.103 milliseconds (cumulative count 3680)
204s 11.460% <= 1.207 milliseconds (cumulative count 11460)
204s 25.100% <= 1.303 milliseconds (cumulative count 25100)
204s 40.010% <= 1.407 milliseconds (cumulative count 40010)
204s 51.510% <= 1.503 milliseconds (cumulative count 51510)
204s 62.590% <= 1.607 milliseconds (cumulative count 62590)
204s 71.460% <= 1.703 milliseconds (cumulative count 71460)
204s 79.820% <= 1.807 milliseconds (cumulative count 79820)
204s 86.860% <= 1.903 milliseconds (cumulative count 86860)
204s 93.640% <= 2.007 milliseconds (cumulative count 93640)
204s 97.320% <= 2.103 milliseconds (cumulative count 97320)
204s 100.000% <= 3.103 milliseconds (cumulative count 100000)
204s
204s Summary:
204s throughput summary: 289017.34 requests per second
204s latency summary (msec):
204s avg min p50 p95 p99 max
204s 1.530 0.528 1.495 2.039 2.199 2.639
205s LPOP: rps=219203.2 (overall: 258309.9) avg_msec=1.712 (overall: 1.712)
205s ====== LPOP ======
205s 100000 requests completed in 0.39 seconds
205s 50 parallel clients
205s 3 bytes payload
205s keep alive: 1
205s host configuration "save": 3600 1 300 100 60 10000
205s host configuration "appendonly": no
205s multi-thread: no
205s
205s Latency by percentile distribution:
205s 0.000% <= 0.599 milliseconds (cumulative count 10)
205s 50.000% <= 1.703 milliseconds (cumulative count 50780)
205s 75.000% <= 1.967 milliseconds (cumulative count 75060)
205s 87.500% <= 2.143 milliseconds (cumulative count 87860)
205s 93.750% <= 2.263 milliseconds (cumulative count 93870)
205s 96.875% <= 2.359 milliseconds (cumulative count 96890)
205s 98.438% <= 2.463 milliseconds (cumulative count 98460)
205s 99.219% <= 2.575 milliseconds (cumulative count 99220)
205s 99.609% <= 2.711 milliseconds (cumulative count 99610)
205s 99.805% <= 2.879 milliseconds (cumulative count 99810)
205s 99.902% <= 3.023 milliseconds (cumulative count 99910)
205s 99.951% <= 3.191 milliseconds (cumulative count 99960)
205s 99.976% <= 3.327 milliseconds (cumulative count 99980)
205s 99.988% <= 3.375 milliseconds (cumulative count 99990)
205s 99.994% <= 3.791 milliseconds (cumulative count 100000)
205s 100.000% <= 3.791 milliseconds (cumulative count 100000)
205s
205s Cumulative distribution of latencies:
205s 0.000% <= 0.103 milliseconds (cumulative count 0)
205s 0.010% <= 0.607 milliseconds (cumulative count 10)
205s 0.100% <= 0.703 milliseconds (cumulative count 100)
205s 0.330% <= 0.807 milliseconds (cumulative count 330)
205s 0.660% <= 0.903 milliseconds (cumulative count 660)
205s 1.070% <= 1.007 milliseconds (cumulative count 1070)
205s 1.750% <= 1.103 milliseconds (cumulative count 1750)
205s 3.750% <= 1.207 milliseconds (cumulative count 3750)
205s 9.100% <= 1.303 milliseconds (cumulative count 9100)
205s 18.970% <= 1.407 milliseconds (cumulative count 18970)
205s 29.220% <= 1.503 milliseconds (cumulative count 29220)
205s 40.690% <= 1.607 milliseconds (cumulative count 40690)
205s 50.780% <= 1.703 milliseconds (cumulative count 50780)
205s 61.350% <= 1.807 milliseconds (cumulative count 61350)
205s 69.900% <= 1.903 milliseconds (cumulative count 69900)
205s 78.200% <= 2.007 milliseconds (cumulative count 78200)
205s 85.110% <= 2.103 milliseconds (cumulative count 85110)
205s 99.930% <= 3.103 milliseconds (cumulative count 99930)
205s 100.000% <= 4.103 milliseconds (cumulative count 100000)
205s
205s Summary:
205s throughput summary: 259067.36 requests per second
205s latency summary (msec):
205s avg min p50 p95 p99 max
205s 1.723 0.592 1.703 2.295 2.535 3.791
205s RPOP: rps=272270.9 (overall: 271538.5) avg_msec=1.635 (overall: 1.636)
205s ====== RPOP ======
205s 100000 requests completed in 0.37 seconds
205s 50 parallel clients
205s 3 bytes payload
205s keep alive: 1
205s host configuration "save": 3600 1 300 100 60 10000
205s host configuration "appendonly": no
205s multi-thread: no
205s
205s Latency by percentile distribution:
205s 0.000% <= 0.559 milliseconds (cumulative count 10)
205s 50.000% <= 1.607 milliseconds (cumulative count 50930)
205s 75.000% <= 1.863 milliseconds (cumulative count 75180)
205s 87.500% <= 2.039 milliseconds (cumulative count 87570)
205s 93.750% <= 2.159 milliseconds (cumulative count 94060)
205s 96.875% <= 2.247 milliseconds (cumulative count 96930)
205s 98.438% <= 2.335 milliseconds (cumulative count 98500)
205s 99.219% <= 2.399 milliseconds (cumulative count 99220)
205s 99.609% <= 2.495 milliseconds (cumulative count 99620)
205s 99.805% <= 2.559 milliseconds (cumulative count 99820)
205s 99.902% <= 2.639 milliseconds (cumulative count 99910)
205s 99.951% <= 2.719 milliseconds (cumulative count 99970)
205s 99.976% <= 2.727 milliseconds (cumulative count 99990)
205s 99.994% <= 2.751 milliseconds (cumulative count 100000)
205s 100.000% <= 2.751 milliseconds (cumulative count 100000)
205s
205s Cumulative distribution of latencies:
205s 0.000% <= 0.103 milliseconds (cumulative count 0)
205s 0.110% <= 0.607 milliseconds (cumulative count 110)
205s 0.280% <= 0.703 milliseconds (cumulative count 280)
205s 0.600% <= 0.807 milliseconds (cumulative count 600)
205s 1.000% <= 0.903 milliseconds (cumulative count 1000)
205s 1.770% <= 1.007 milliseconds (cumulative count 1770)
205s 2.920% <= 1.103 milliseconds (cumulative count 2920)
205s 5.790% <= 1.207 milliseconds (cumulative count 5790)
205s 13.500% <= 1.303 milliseconds (cumulative count 13500)
205s 26.650% <= 1.407 milliseconds (cumulative count 26650)
205s 38.840% <= 1.503 milliseconds (cumulative count 38840)
205s 50.930% <= 1.607 milliseconds (cumulative count 50930)
205s 61.120% <= 1.703 milliseconds (cumulative count 61120)
205s 70.670% <= 1.807 milliseconds (cumulative count 70670)
205s 78.210% <= 1.903 milliseconds (cumulative count 78210)
205s 85.460% <= 2.007 milliseconds (cumulative count 85460)
205s 91.320% <= 2.103 milliseconds (cumulative count 91320)
205s 100.000% <= 3.103 milliseconds (cumulative count 100000)
205s
205s Summary:
205s throughput summary: 271739.12 requests per second
205s latency summary
(msec): 205s avg min p50 p95 p99 max 205s 1.634 0.552 1.607 2.183 2.383 2.751 205s SADD: rps=274200.0 (overall: 336029.4) avg_msec=1.301 (overall: 1.301) ====== SADD ====== 205s 100000 requests completed in 0.29 seconds 205s 50 parallel clients 205s 3 bytes payload 205s keep alive: 1 205s host configuration "save": 3600 1 300 100 60 10000 205s host configuration "appendonly": no 205s multi-thread: no 205s 205s Latency by percentile distribution: 205s 0.000% <= 0.359 milliseconds (cumulative count 10) 205s 50.000% <= 1.287 milliseconds (cumulative count 50430) 205s 75.000% <= 1.455 milliseconds (cumulative count 75510) 205s 87.500% <= 1.567 milliseconds (cumulative count 87550) 205s 93.750% <= 1.687 milliseconds (cumulative count 93920) 205s 96.875% <= 1.807 milliseconds (cumulative count 96950) 205s 98.438% <= 1.943 milliseconds (cumulative count 98500) 205s 99.219% <= 2.071 milliseconds (cumulative count 99230) 205s 99.609% <= 2.263 milliseconds (cumulative count 99620) 205s 99.805% <= 2.559 milliseconds (cumulative count 99810) 205s 99.902% <= 2.847 milliseconds (cumulative count 99910) 205s 99.951% <= 3.223 milliseconds (cumulative count 99960) 205s 99.976% <= 3.271 milliseconds (cumulative count 99980) 205s 99.988% <= 3.311 milliseconds (cumulative count 99990) 205s 99.994% <= 3.327 milliseconds (cumulative count 100000) 205s 100.000% <= 3.327 milliseconds (cumulative count 100000) 205s 205s Cumulative distribution of latencies: 205s 0.000% <= 0.103 milliseconds (cumulative count 0) 205s 0.080% <= 0.407 milliseconds (cumulative count 80) 205s 0.300% <= 0.503 milliseconds (cumulative count 300) 205s 0.720% <= 0.607 milliseconds (cumulative count 720) 205s 1.220% <= 0.703 milliseconds (cumulative count 1220) 205s 2.240% <= 0.807 milliseconds (cumulative count 2240) 205s 6.480% <= 0.903 milliseconds (cumulative count 6480) 205s 14.950% <= 1.007 milliseconds (cumulative count 14950) 205s 24.430% <= 1.103 milliseconds (cumulative count 24430) 205s 37.850% <= 1.207 
milliseconds (cumulative count 37850) 205s 52.950% <= 1.303 milliseconds (cumulative count 52950) 205s 68.980% <= 1.407 milliseconds (cumulative count 68980) 205s 81.420% <= 1.503 milliseconds (cumulative count 81420) 205s 90.280% <= 1.607 milliseconds (cumulative count 90280) 205s 94.380% <= 1.703 milliseconds (cumulative count 94380) 205s 96.950% <= 1.807 milliseconds (cumulative count 96950) 205s 98.170% <= 1.903 milliseconds (cumulative count 98170) 205s 98.980% <= 2.007 milliseconds (cumulative count 98980) 205s 99.340% <= 2.103 milliseconds (cumulative count 99340) 205s 99.950% <= 3.103 milliseconds (cumulative count 99950) 205s 100.000% <= 4.103 milliseconds (cumulative count 100000) 205s 205s Summary: 205s throughput summary: 342465.75 requests per second 205s latency summary (msec): 205s avg min p50 p95 p99 max 205s 1.288 0.352 1.287 1.727 2.015 3.327 206s HSET: rps=205298.8 (overall: 322062.5) avg_msec=1.398 (overall: 1.398) ====== HSET ====== 206s 100000 requests completed in 0.31 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.487 milliseconds (cumulative count 10) 206s 50.000% <= 1.407 milliseconds (cumulative count 50700) 206s 75.000% <= 1.575 milliseconds (cumulative count 75020) 206s 87.500% <= 1.687 milliseconds (cumulative count 87920) 206s 93.750% <= 1.767 milliseconds (cumulative count 94140) 206s 96.875% <= 1.831 milliseconds (cumulative count 97100) 206s 98.438% <= 1.903 milliseconds (cumulative count 98460) 206s 99.219% <= 1.967 milliseconds (cumulative count 99220) 206s 99.609% <= 2.039 milliseconds (cumulative count 99640) 206s 99.805% <= 2.095 milliseconds (cumulative count 99860) 206s 99.902% <= 2.151 milliseconds (cumulative count 99920) 206s 99.951% <= 2.199 milliseconds (cumulative count 99960) 206s 99.976% <= 2.215 milliseconds 
(cumulative count 99990) 206s 99.994% <= 2.303 milliseconds (cumulative count 100000) 206s 100.000% <= 2.303 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.010% <= 0.503 milliseconds (cumulative count 10) 206s 0.070% <= 0.607 milliseconds (cumulative count 70) 206s 0.170% <= 0.703 milliseconds (cumulative count 170) 206s 0.560% <= 0.807 milliseconds (cumulative count 560) 206s 1.480% <= 0.903 milliseconds (cumulative count 1480) 206s 5.850% <= 1.007 milliseconds (cumulative count 5850) 206s 13.610% <= 1.103 milliseconds (cumulative count 13610) 206s 23.630% <= 1.207 milliseconds (cumulative count 23630) 206s 35.710% <= 1.303 milliseconds (cumulative count 35710) 206s 50.700% <= 1.407 milliseconds (cumulative count 50700) 206s 65.260% <= 1.503 milliseconds (cumulative count 65260) 206s 79.030% <= 1.607 milliseconds (cumulative count 79030) 206s 89.380% <= 1.703 milliseconds (cumulative count 89380) 206s 96.220% <= 1.807 milliseconds (cumulative count 96220) 206s 98.460% <= 1.903 milliseconds (cumulative count 98460) 206s 99.490% <= 2.007 milliseconds (cumulative count 99490) 206s 99.860% <= 2.103 milliseconds (cumulative count 99860) 206s 100.000% <= 3.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 322580.66 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 1.397 0.480 1.407 1.783 1.951 2.303 206s SPOP: rps=147729.1 (overall: 374545.5) avg_msec=1.168 (overall: 1.168) ====== SPOP ====== 206s 100000 requests completed in 0.26 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.351 milliseconds (cumulative count 10) 206s 50.000% <= 1.127 milliseconds (cumulative count 50090) 206s 75.000% <= 1.295 
milliseconds (cumulative count 75700) 206s 87.500% <= 1.399 milliseconds (cumulative count 87890) 206s 93.750% <= 1.471 milliseconds (cumulative count 93930) 206s 96.875% <= 1.551 milliseconds (cumulative count 97080) 206s 98.438% <= 1.631 milliseconds (cumulative count 98490) 206s 99.219% <= 1.719 milliseconds (cumulative count 99220) 206s 99.609% <= 1.831 milliseconds (cumulative count 99620) 206s 99.805% <= 2.199 milliseconds (cumulative count 99810) 206s 99.902% <= 2.359 milliseconds (cumulative count 99910) 206s 99.951% <= 2.455 milliseconds (cumulative count 99960) 206s 99.976% <= 2.503 milliseconds (cumulative count 99980) 206s 99.988% <= 2.543 milliseconds (cumulative count 99990) 206s 99.994% <= 2.575 milliseconds (cumulative count 100000) 206s 100.000% <= 2.575 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.050% <= 0.407 milliseconds (cumulative count 50) 206s 0.270% <= 0.503 milliseconds (cumulative count 270) 206s 0.510% <= 0.607 milliseconds (cumulative count 510) 206s 0.990% <= 0.703 milliseconds (cumulative count 990) 206s 3.550% <= 0.807 milliseconds (cumulative count 3550) 206s 13.380% <= 0.903 milliseconds (cumulative count 13380) 206s 30.730% <= 1.007 milliseconds (cumulative count 30730) 206s 46.200% <= 1.103 milliseconds (cumulative count 46200) 206s 63.070% <= 1.207 milliseconds (cumulative count 63070) 206s 76.860% <= 1.303 milliseconds (cumulative count 76860) 206s 88.800% <= 1.407 milliseconds (cumulative count 88800) 206s 95.620% <= 1.503 milliseconds (cumulative count 95620) 206s 98.240% <= 1.607 milliseconds (cumulative count 98240) 206s 99.100% <= 1.703 milliseconds (cumulative count 99100) 206s 99.570% <= 1.807 milliseconds (cumulative count 99570) 206s 99.710% <= 1.903 milliseconds (cumulative count 99710) 206s 99.780% <= 2.007 milliseconds (cumulative count 99780) 206s 99.790% <= 2.103 milliseconds (cumulative count 99790) 206s 
100.000% <= 3.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 389105.06 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 1.140 0.344 1.127 1.495 1.687 2.575 206s ZADD: rps=93320.0 (overall: 262134.8) avg_msec=1.677 (overall: 1.677) ZADD: rps=276932.3 (overall: 273058.8) avg_msec=1.597 (overall: 1.617) ====== ZADD ====== 206s 100000 requests completed in 0.37 seconds 206s 50 parallel clients 206s 3 bytes payload 206s keep alive: 1 206s host configuration "save": 3600 1 300 100 60 10000 206s host configuration "appendonly": no 206s multi-thread: no 206s 206s Latency by percentile distribution: 206s 0.000% <= 0.415 milliseconds (cumulative count 10) 206s 50.000% <= 1.567 milliseconds (cumulative count 50150) 206s 75.000% <= 1.847 milliseconds (cumulative count 75480) 206s 87.500% <= 2.023 milliseconds (cumulative count 87830) 206s 93.750% <= 2.135 milliseconds (cumulative count 93960) 206s 96.875% <= 2.247 milliseconds (cumulative count 96910) 206s 98.438% <= 2.375 milliseconds (cumulative count 98460) 206s 99.219% <= 2.511 milliseconds (cumulative count 99230) 206s 99.609% <= 2.623 milliseconds (cumulative count 99610) 206s 99.805% <= 2.991 milliseconds (cumulative count 99810) 206s 99.902% <= 3.255 milliseconds (cumulative count 99910) 206s 99.951% <= 3.455 milliseconds (cumulative count 99960) 206s 99.976% <= 3.559 milliseconds (cumulative count 99980) 206s 99.988% <= 3.607 milliseconds (cumulative count 99990) 206s 99.994% <= 3.671 milliseconds (cumulative count 100000) 206s 100.000% <= 3.671 milliseconds (cumulative count 100000) 206s 206s Cumulative distribution of latencies: 206s 0.000% <= 0.103 milliseconds (cumulative count 0) 206s 0.080% <= 0.503 milliseconds (cumulative count 80) 206s 0.200% <= 0.607 milliseconds (cumulative count 200) 206s 0.360% <= 0.703 milliseconds (cumulative count 360) 206s 0.730% <= 0.807 milliseconds (cumulative count 730) 206s 1.110% <= 0.903 milliseconds 
(cumulative count 1110) 206s 1.840% <= 1.007 milliseconds (cumulative count 1840) 206s 3.220% <= 1.103 milliseconds (cumulative count 3220) 206s 6.550% <= 1.207 milliseconds (cumulative count 6550) 206s 16.090% <= 1.303 milliseconds (cumulative count 16090) 206s 30.690% <= 1.407 milliseconds (cumulative count 30690) 206s 42.960% <= 1.503 milliseconds (cumulative count 42960) 206s 54.450% <= 1.607 milliseconds (cumulative count 54450) 206s 63.850% <= 1.703 milliseconds (cumulative count 63850) 206s 72.660% <= 1.807 milliseconds (cumulative count 72660) 206s 79.610% <= 1.903 milliseconds (cumulative count 79610) 206s 86.740% <= 2.007 milliseconds (cumulative count 86740) 206s 92.560% <= 2.103 milliseconds (cumulative count 92560) 206s 99.860% <= 3.103 milliseconds (cumulative count 99860) 206s 100.000% <= 4.103 milliseconds (cumulative count 100000) 206s 206s Summary: 206s throughput summary: 273972.59 requests per second 206s latency summary (msec): 206s avg min p50 p95 p99 max 206s 1.613 0.408 1.567 2.167 2.463 3.671 207s ZPOPMIN: rps=293040.0 (overall: 330000.0) avg_msec=1.322 (overall: 1.322) ====== ZPOPMIN ====== 207s 100000 requests completed in 0.30 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.447 milliseconds (cumulative count 10) 207s 50.000% <= 1.279 milliseconds (cumulative count 50990) 207s 75.000% <= 1.511 milliseconds (cumulative count 75330) 207s 87.500% <= 1.671 milliseconds (cumulative count 87510) 207s 93.750% <= 1.767 milliseconds (cumulative count 94250) 207s 96.875% <= 1.831 milliseconds (cumulative count 96980) 207s 98.438% <= 1.911 milliseconds (cumulative count 98490) 207s 99.219% <= 1.959 milliseconds (cumulative count 99240) 207s 99.609% <= 2.007 milliseconds (cumulative count 99630) 207s 99.805% <= 2.063 milliseconds (cumulative count 
99820) 207s 99.902% <= 2.119 milliseconds (cumulative count 99910) 207s 99.951% <= 2.167 milliseconds (cumulative count 99960) 207s 99.976% <= 2.263 milliseconds (cumulative count 99980) 207s 99.988% <= 2.271 milliseconds (cumulative count 99990) 207s 99.994% <= 2.303 milliseconds (cumulative count 100000) 207s 100.000% <= 2.303 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.030% <= 0.503 milliseconds (cumulative count 30) 207s 0.330% <= 0.607 milliseconds (cumulative count 330) 207s 0.920% <= 0.703 milliseconds (cumulative count 920) 207s 1.850% <= 0.807 milliseconds (cumulative count 1850) 207s 3.350% <= 0.903 milliseconds (cumulative count 3350) 207s 7.350% <= 1.007 milliseconds (cumulative count 7350) 207s 19.120% <= 1.103 milliseconds (cumulative count 19120) 207s 39.840% <= 1.207 milliseconds (cumulative count 39840) 207s 54.120% <= 1.303 milliseconds (cumulative count 54120) 207s 65.820% <= 1.407 milliseconds (cumulative count 65820) 207s 74.580% <= 1.503 milliseconds (cumulative count 74580) 207s 82.780% <= 1.607 milliseconds (cumulative count 82780) 207s 89.860% <= 1.703 milliseconds (cumulative count 89860) 207s 96.280% <= 1.807 milliseconds (cumulative count 96280) 207s 98.330% <= 1.903 milliseconds (cumulative count 98330) 207s 99.630% <= 2.007 milliseconds (cumulative count 99630) 207s 99.900% <= 2.103 milliseconds (cumulative count 99900) 207s 100.000% <= 3.103 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 331125.84 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 1.322 0.440 1.279 1.783 1.951 2.303 207s LPUSH (needed to benchmark LRANGE): rps=172948.2 (overall: 258392.9) avg_msec=1.697 (overall: 1.697) ====== LPUSH (needed to benchmark LRANGE) ====== 207s 100000 requests completed in 0.38 seconds 207s 50 parallel clients 207s 3 bytes payload 207s keep alive: 1 207s host 
configuration "save": 3600 1 300 100 60 10000 207s host configuration "appendonly": no 207s multi-thread: no 207s 207s Latency by percentile distribution: 207s 0.000% <= 0.671 milliseconds (cumulative count 20) 207s 50.000% <= 1.671 milliseconds (cumulative count 50530) 207s 75.000% <= 1.927 milliseconds (cumulative count 75180) 207s 87.500% <= 2.095 milliseconds (cumulative count 87710) 207s 93.750% <= 2.207 milliseconds (cumulative count 93950) 207s 96.875% <= 2.295 milliseconds (cumulative count 97010) 207s 98.438% <= 2.375 milliseconds (cumulative count 98490) 207s 99.219% <= 2.447 milliseconds (cumulative count 99290) 207s 99.609% <= 2.543 milliseconds (cumulative count 99620) 207s 99.805% <= 2.695 milliseconds (cumulative count 99810) 207s 99.902% <= 2.887 milliseconds (cumulative count 99910) 207s 99.951% <= 3.047 milliseconds (cumulative count 99960) 207s 99.976% <= 3.119 milliseconds (cumulative count 99980) 207s 99.988% <= 3.151 milliseconds (cumulative count 99990) 207s 99.994% <= 3.183 milliseconds (cumulative count 100000) 207s 100.000% <= 3.183 milliseconds (cumulative count 100000) 207s 207s Cumulative distribution of latencies: 207s 0.000% <= 0.103 milliseconds (cumulative count 0) 207s 0.050% <= 0.703 milliseconds (cumulative count 50) 207s 0.200% <= 0.807 milliseconds (cumulative count 200) 207s 0.420% <= 0.903 milliseconds (cumulative count 420) 207s 0.940% <= 1.007 milliseconds (cumulative count 940) 207s 1.990% <= 1.103 milliseconds (cumulative count 1990) 207s 5.050% <= 1.207 milliseconds (cumulative count 5050) 207s 10.990% <= 1.303 milliseconds (cumulative count 10990) 207s 20.900% <= 1.407 milliseconds (cumulative count 20900) 207s 31.640% <= 1.503 milliseconds (cumulative count 31640) 207s 43.200% <= 1.607 milliseconds (cumulative count 43200) 207s 53.940% <= 1.703 milliseconds (cumulative count 53940) 207s 64.310% <= 1.807 milliseconds (cumulative count 64310) 207s 72.980% <= 1.903 milliseconds (cumulative count 72980) 207s 81.580% <= 
2.007 milliseconds (cumulative count 81580) 207s 88.240% <= 2.103 milliseconds (cumulative count 88240) 207s 99.970% <= 3.103 milliseconds (cumulative count 99970) 207s 100.000% <= 4.103 milliseconds (cumulative count 100000) 207s 207s Summary: 207s throughput summary: 261780.11 requests per second 207s latency summary (msec): 207s avg min p50 p95 p99 max 207s 1.691 0.664 1.671 2.239 2.415 3.183 208s LRANGE_100 (first 100 elements): rps=10158.1 (overall: 71388.9) avg_msec=4.059 (overall: 4.059) LRANGE_100 (first 100 elements): rps=74071.1 (overall: 73737.0) avg_msec=3.394 (overall: 3.474) LRANGE_100 (first 100 elements): rps=74404.8 (overall: 74048.1) avg_msec=3.380 (overall: 3.430) LRANGE_100 (first 100 elements): rps=74661.4 (overall: 74242.4) avg_msec=3.383 (overall: 3.415) LRANGE_100 (first 100 elements): rps=75119.0 (overall: 74454.0) avg_msec=3.356 (overall: 3.401) LRANGE_100 (first 100 elements): rps=74901.2 (overall: 74541.2) avg_msec=3.366 (overall: 3.394) ====== LRANGE_100 (first 100 elements) ====== 208s 100000 requests completed in 1.34 seconds 208s 50 parallel clients 208s 3 bytes payload 208s keep alive: 1 208s host configuration "save": 3600 1 300 100 60 10000 208s host configuration "appendonly": no 208s multi-thread: no 208s 208s Latency by percentile distribution: 208s 0.000% <= 0.743 milliseconds (cumulative count 10) 208s 50.000% <= 3.359 milliseconds (cumulative count 50930) 208s 75.000% <= 3.447 milliseconds (cumulative count 75330) 208s 87.500% <= 3.527 milliseconds (cumulative count 87590) 208s 93.750% <= 3.631 milliseconds (cumulative count 93790) 208s 96.875% <= 3.783 milliseconds (cumulative count 96960) 208s 98.438% <= 3.959 milliseconds (cumulative count 98440) 208s 99.219% <= 4.399 milliseconds (cumulative count 99220) 208s 99.609% <= 5.327 milliseconds (cumulative count 99610) 208s 99.805% <= 6.583 milliseconds (cumulative count 99810) 208s 99.902% <= 7.567 milliseconds (cumulative count 99910) 208s 99.951% <= 8.303 milliseconds 
(cumulative count 99960) 208s 99.976% <= 8.535 milliseconds (cumulative count 99980) 208s 99.988% <= 8.663 milliseconds (cumulative count 99990) 208s 99.994% <= 8.799 milliseconds (cumulative count 100000) 208s 100.000% <= 8.799 milliseconds (cumulative count 100000) 208s 208s Cumulative distribution of latencies: 208s 0.000% <= 0.103 milliseconds (cumulative count 0) 208s 0.010% <= 0.807 milliseconds (cumulative count 10) 208s 0.020% <= 1.903 milliseconds (cumulative count 20) 208s 0.030% <= 2.103 milliseconds (cumulative count 30) 208s 1.220% <= 3.103 milliseconds (cumulative count 1220) 208s 98.740% <= 4.103 milliseconds (cumulative count 98740) 208s 99.580% <= 5.103 milliseconds (cumulative count 99580) 208s 99.730% <= 6.103 milliseconds (cumulative count 99730) 208s 99.870% <= 7.103 milliseconds (cumulative count 99870) 208s 99.940% <= 8.103 milliseconds (cumulative count 99940) 208s 100.000% <= 9.103 milliseconds (cumulative count 100000) 208s 208s Summary: 208s throughput summary: 74571.22 requests per second 208s latency summary (msec): 208s avg min p50 p95 p99 max 208s 3.395 0.736 3.359 3.687 4.199 8.799 213s LRANGE_300 (first 300 elements): rps=17553.8 (overall: 21598.0) avg_msec=11.997 (overall: 11.997) LRANGE_300 (first 300 elements): rps=23678.4 (overall: 22753.8) avg_msec=10.613 (overall: 11.197) LRANGE_300 (first 300 elements): rps=23330.7 (overall: 22957.7) avg_msec=10.686 (overall: 11.013) LRANGE_300 (first 300 elements): rps=23072.0 (overall: 22987.5) avg_msec=11.148 (overall: 11.048) LRANGE_300 (first 300 elements): rps=23792.8 (overall: 23154.4) avg_msec=10.311 (overall: 10.891) LRANGE_300 (first 300 elements): rps=23780.4 (overall: 23263.3) avg_msec=10.022 (overall: 10.737) LRANGE_300 (first 300 elements): rps=23614.2 (overall: 23315.1) avg_msec=10.242 (overall: 10.663) LRANGE_300 (first 300 elements): rps=23628.5 (overall: 23355.3) avg_msec=10.618 (overall: 10.657) LRANGE_300 (first 300 elements): rps=22533.3 (overall: 23261.2) avg_msec=11.203 
(overall: 10.717) LRANGE_300 (first 300 elements): rps=22857.1 (overall: 23220.2) avg_msec=11.447 (overall: 10.790) LRANGE_300 (first 300 elements): rps=23830.1 (overall: 23277.8) avg_msec=10.115 (overall: 10.725) LRANGE_300 (first 300 elements): rps=23208.0 (overall: 23272.0) avg_msec=10.922 (overall: 10.742) LRANGE_300 (first 300 elements): rps=21336.0 (overall: 23120.9) avg_msec=12.998 (overall: 10.904) LRANGE_300 (first 300 elements): rps=23798.5 (overall: 23170.9) avg_msec=10.069 (overall: 10.841) LRANGE_300 (first 300 elements): rps=23637.8 (overall: 23202.5) avg_msec=10.838 (overall: 10.841) LRANGE_300 (first 300 elements): rps=24365.1 (overall: 23275.6) avg_msec=9.895 (overall: 10.778) LRANGE_300 (first 300 elements): rps=24400.0 (overall: 23342.9) avg_msec=10.035 (overall: 10.732) ====== LRANGE_300 (first 300 elements) ====== 213s 100000 requests completed in 4.28 seconds 213s 50 parallel clients 213s 3 bytes payload 213s keep alive: 1 213s host configuration "save": 3600 1 300 100 60 10000 213s host configuration "appendonly": no 213s multi-thread: no 213s 213s Latency by percentile distribution: 213s 0.000% <= 0.735 milliseconds (cumulative count 10) 213s 50.000% <= 10.327 milliseconds (cumulative count 50030) 213s 75.000% <= 12.447 milliseconds (cumulative count 75060) 213s 87.500% <= 14.639 milliseconds (cumulative count 87500) 213s 93.750% <= 16.799 milliseconds (cumulative count 93790) 213s 96.875% <= 18.591 milliseconds (cumulative count 96880) 213s 98.438% <= 20.767 milliseconds (cumulative count 98440) 213s 99.219% <= 22.255 milliseconds (cumulative count 99220) 213s 99.609% <= 23.167 milliseconds (cumulative count 99610) 213s 99.805% <= 24.239 milliseconds (cumulative count 99810) 213s 99.902% <= 29.135 milliseconds (cumulative count 99910) 213s 99.951% <= 30.095 milliseconds (cumulative count 99960) 213s 99.976% <= 30.623 milliseconds (cumulative count 99980) 213s 99.988% <= 30.815 milliseconds (cumulative count 99990) 213s 99.994% <= 31.071 
milliseconds (cumulative count 100000) 213s 100.000% <= 31.071 milliseconds (cumulative count 100000) 213s 213s Cumulative distribution of latencies: 213s 0.000% <= 0.103 milliseconds (cumulative count 0) 213s 0.010% <= 0.807 milliseconds (cumulative count 10) 213s 0.030% <= 1.007 milliseconds (cumulative count 30) 213s 0.040% <= 1.103 milliseconds (cumulative count 40) 213s 0.050% <= 1.207 milliseconds (cumulative count 50) 213s 0.060% <= 1.303 milliseconds (cumulative count 60) 213s 0.080% <= 1.407 milliseconds (cumulative count 80) 213s 0.130% <= 1.503 milliseconds (cumulative count 130) 213s 0.140% <= 1.607 milliseconds (cumulative count 140) 213s 0.170% <= 1.703 milliseconds (cumulative count 170) 213s 0.220% <= 1.807 milliseconds (cumulative count 220) 213s 0.250% <= 1.903 milliseconds (cumulative count 250) 213s 0.300% <= 2.007 milliseconds (cumulative count 300) 213s 0.340% <= 2.103 milliseconds (cumulative count 340) 213s 0.510% <= 3.103 milliseconds (cumulative count 510) 213s 1.110% <= 4.103 milliseconds (cumulative count 1110) 213s 2.800% <= 5.103 milliseconds (cumulative count 2800) 213s 6.590% <= 6.103 milliseconds (cumulative count 6590) 213s 12.640% <= 7.103 milliseconds (cumulative count 12640) 213s 21.820% <= 8.103 milliseconds (cumulative count 21820) 213s 33.940% <= 9.103 milliseconds (cumulative count 33940) 213s 46.970% <= 10.103 milliseconds (cumulative count 46970) 213s 60.590% <= 11.103 milliseconds (cumulative count 60590) 213s 71.920% <= 12.103 milliseconds (cumulative count 71920) 213s 79.880% <= 13.103 milliseconds (cumulative count 79880) 213s 85.380% <= 14.103 milliseconds (cumulative count 85380) 213s 89.120% <= 15.103 milliseconds (cumulative count 89120) 213s 92.100% <= 16.103 milliseconds (cumulative count 92100) 213s 94.380% <= 17.103 milliseconds (cumulative count 94380) 213s 96.230% <= 18.111 milliseconds (cumulative count 96230) 213s 97.360% <= 19.103 milliseconds (cumulative count 97360) 213s 98.030% <= 20.111 milliseconds 
(cumulative count 98030) 213s 98.650% <= 21.103 milliseconds (cumulative count 98650) 213s 99.150% <= 22.111 milliseconds (cumulative count 99150) 213s 99.570% <= 23.103 milliseconds (cumulative count 99570) 213s 99.790% <= 24.111 milliseconds (cumulative count 99790) 213s 99.890% <= 25.103 milliseconds (cumulative count 99890) 213s 99.900% <= 29.103 milliseconds (cumulative count 99900) 213s 99.960% <= 30.111 milliseconds (cumulative count 99960) 213s 100.000% <= 31.103 milliseconds (cumulative count 100000) 213s 213s Summary: 213s throughput summary: 23348.12 requests per second 213s latency summary (msec): 213s avg min p50 p95 p99 max 213s 10.723 0.728 10.327 17.391 21.823 31.071 220s LRANGE_500 (first 500 elements): rps=9127.0 (overall: 10132.2) avg_msec=24.293 (overall: 24.293) LRANGE_500 (first 500 elements): rps=13426.9 (overall: 11868.8) avg_msec=18.036 (overall: 20.562) LRANGE_500 (first 500 elements): rps=13570.3 (overall: 12460.6) avg_msec=17.494 (overall: 19.400) LRANGE_500 (first 500 elements): rps=14070.3 (overall: 12876.0) avg_msec=14.017 (overall: 17.882) LRANGE_500 (first 500 elements): rps=14135.5 (overall: 13130.3) avg_msec=13.866 (overall: 17.009) LRANGE_500 (first 500 elements): rps=14096.9 (overall: 13296.5) avg_msec=14.793 (overall: 16.605) LRANGE_500 (first 500 elements): rps=14165.4 (overall: 13422.2) avg_msec=13.710 (overall: 16.163) LRANGE_500 (first 500 elements): rps=14170.0 (overall: 13516.4) avg_msec=14.251 (overall: 15.910) LRANGE_500 (first 500 elements): rps=13876.5 (overall: 13556.4) avg_msec=14.850 (overall: 15.790) LRANGE_500 (first 500 elements): rps=14007.8 (overall: 13602.2) avg_msec=15.751 (overall: 15.786) LRANGE_500 (first 500 elements): rps=14171.3 (overall: 13653.9) avg_msec=14.186 (overall: 15.635) LRANGE_500 (first 500 elements): rps=14244.1 (overall: 13703.5) avg_msec=14.456 (overall: 15.532) LRANGE_500 (first 500 elements): rps=14154.2 (overall: 13738.4) avg_msec=14.172 (overall: 15.423) LRANGE_500 (first 500 
elements): rps=13660.2 (overall: 13732.7) avg_msec=16.454 (overall: 15.499) LRANGE_500 (first 500 elements): rps=13511.9 (overall: 13717.9) avg_msec=17.646 (overall: 15.640) LRANGE_500 (first 500 elements): rps=14316.4 (overall: 13755.9) avg_msec=14.919 (overall: 15.592) LRANGE_500 (first 500 elements): rps=14055.1 (overall: 13773.6) avg_msec=14.367 (overall: 15.518) LRANGE_500 (first 500 elements): rps=14231.1 (overall: 13798.9) avg_msec=14.495 (overall: 15.460) LRANGE_500 (first 500 elements): rps=14019.3 (overall: 13810.7) avg_msec=14.832 (overall: 15.425) LRANGE_500 (first 500 elements): rps=13869.0 (overall: 13813.7) avg_msec=15.610 (overall: 15.435) LRANGE_500 (first 500 elements): rps=14182.5 (overall: 13831.2) avg_msec=14.093 (overall: 15.369) LRANGE_500 (first 500 elements): rps=14085.9 (overall: 13842.9) avg_msec=13.639 (overall: 15.288) LRANGE_500 (first 500 elements): rps=14195.3 (overall: 13858.4) avg_msec=14.545 (overall: 15.255) LRANGE_500 (first 500 elements): rps=14098.0 (overall: 13868.5) avg_msec=14.707 (overall: 15.231) LRANGE_500 (first 500 elements): rps=13821.4 (overall: 13866.6) avg_msec=15.937 (overall: 15.259) LRANGE_500 (first 500 elements): rps=13515.9 (overall: 13853.1) avg_msec=16.895 (overall: 15.321) LRANGE_500 (first 500 elements): rps=13486.2 (overall: 13839.6) avg_msec=15.585 (overall: 15.330) LRANGE_500 (first 500 elements): rps=12836.0 (overall: 13804.1) avg_msec=19.311 (overall: 15.461) ====== LRANGE_500 (first 500 elements) ====== 220s 100000 requests completed in 7.26 seconds 220s 50 parallel clients 220s 3 bytes payload 220s keep alive: 1 220s host configuration "save": 3600 1 300 100 60 10000 220s host configuration "appendonly": no 220s multi-thread: no 220s 220s Latency by percentile distribution: 220s 0.000% <= 0.639 milliseconds (cumulative count 10) 220s 50.000% <= 14.719 milliseconds (cumulative count 50040) 220s 75.000% <= 18.095 milliseconds (cumulative count 75000) 220s 87.500% <= 21.439 milliseconds (cumulative 
count 87530) 220s 93.750% <= 25.695 milliseconds (cumulative count 93750) 220s 96.875% <= 29.087 milliseconds (cumulative count 96910) 220s 98.438% <= 31.727 milliseconds (cumulative count 98450) 220s 99.219% <= 33.247 milliseconds (cumulative count 99220) 220s 99.609% <= 34.367 milliseconds (cumulative count 99610) 220s 99.805% <= 35.871 milliseconds (cumulative count 99810) 220s 99.902% <= 38.367 milliseconds (cumulative count 99910) 220s 99.951% <= 39.103 milliseconds (cumulative count 99970) 220s 99.976% <= 39.327 milliseconds (cumulative count 99980) 220s 99.988% <= 39.423 milliseconds (cumulative count 99990) 220s 99.994% <= 40.447 milliseconds (cumulative count 100000) 220s 100.000% <= 40.447 milliseconds (cumulative count 100000) 220s 220s Cumulative distribution of latencies: 220s 0.000% <= 0.103 milliseconds (cumulative count 0) 220s 0.010% <= 0.703 milliseconds (cumulative count 10) 220s 0.020% <= 2.103 milliseconds (cumulative count 20) 220s 0.110% <= 3.103 milliseconds (cumulative count 110) 220s 0.260% <= 4.103 milliseconds (cumulative count 260) 220s 0.680% <= 5.103 milliseconds (cumulative count 680) 220s 1.440% <= 6.103 milliseconds (cumulative count 1440) 220s 2.620% <= 7.103 milliseconds (cumulative count 2620) 220s 4.540% <= 8.103 milliseconds (cumulative count 4540) 220s 7.260% <= 9.103 milliseconds (cumulative count 7260) 220s 11.810% <= 10.103 milliseconds (cumulative count 11810) 220s 18.890% <= 11.103 milliseconds (cumulative count 18890) 220s 26.970% <= 12.103 milliseconds (cumulative count 26970) 220s 35.980% <= 13.103 milliseconds (cumulative count 35980) 220s 44.970% <= 14.103 milliseconds (cumulative count 44970) 220s 53.230% <= 15.103 milliseconds (cumulative count 53230) 220s 60.870% <= 16.103 milliseconds (cumulative count 60870) 220s 68.170% <= 17.103 milliseconds (cumulative count 68170) 220s 75.090% <= 18.111 milliseconds (cumulative count 75090) 220s 80.490% <= 19.103 milliseconds (cumulative count 80490) 220s 84.330% <= 20.111 
milliseconds (cumulative count 84330) 220s 86.870% <= 21.103 milliseconds (cumulative count 86870) 220s 88.720% <= 22.111 milliseconds (cumulative count 88720) 220s 90.330% <= 23.103 milliseconds (cumulative count 90330) 220s 91.750% <= 24.111 milliseconds (cumulative count 91750) 220s 93.050% <= 25.103 milliseconds (cumulative count 93050) 220s 94.240% <= 26.111 milliseconds (cumulative count 94240) 220s 95.230% <= 27.103 milliseconds (cumulative count 95230) 220s 96.180% <= 28.111 milliseconds (cumulative count 96180) 220s 96.920% <= 29.103 milliseconds (cumulative count 96920) 220s 97.440% <= 30.111 milliseconds (cumulative count 97440) 220s 98.000% <= 31.103 milliseconds (cumulative count 98000) 220s 98.640% <= 32.111 milliseconds (cumulative count 98640) 220s 99.150% <= 33.119 milliseconds (cumulative count 99150) 220s 99.520% <= 34.111 milliseconds (cumulative count 99520) 220s 99.730% <= 35.103 milliseconds (cumulative count 99730) 220s 99.820% <= 36.127 milliseconds (cumulative count 99820) 220s 99.880% <= 37.119 milliseconds (cumulative count 99880) 220s 99.900% <= 38.111 milliseconds (cumulative count 99900) 220s 99.970% <= 39.103 milliseconds (cumulative count 99970) 220s 99.990% <= 40.127 milliseconds (cumulative count 99990) 220s 100.000% <= 41.119 milliseconds (cumulative count 100000) 220s 220s Summary: 220s throughput summary: 13766.52 requests per second 220s latency summary (msec): 220s avg min p50 p95 p99 max 220s 15.574 0.632 14.719 26.863 32.751 40.447 229s LRANGE_600 (first 600 elements): rps=2198.4 (overall: 8393.9) avg_msec=30.058 (overall: 30.058) LRANGE_600 (first 600 elements): rps=10539.7 (overall: 10094.3) avg_msec=21.011 (overall: 22.572) LRANGE_600 (first 600 elements): rps=10529.6 (overall: 10287.2) avg_msec=23.415 (overall: 22.954) LRANGE_600 (first 600 elements): rps=10732.3 (overall: 10424.2) avg_msec=19.473 (overall: 21.851) LRANGE_600 (first 600 elements): rps=11378.0 (overall: 10648.7) avg_msec=17.093 (overall: 20.654) 
LRANGE_600 (first 600 elements): rps=10131.0 (overall: 10550.7) avg_msec=24.467 (overall: 21.347) LRANGE_600 (first 600 elements): rps=9708.0 (overall: 10417.5) avg_msec=26.256 (overall: 22.071) LRANGE_600 (first 600 elements): rps=10478.1 (overall: 10425.8) avg_msec=20.931 (overall: 21.914) LRANGE_600 (first 600 elements): rps=10223.1 (overall: 10401.3) avg_msec=22.093 (overall: 21.935) LRANGE_600 (first 600 elements): rps=11135.5 (overall: 10480.3) avg_msec=19.457 (overall: 21.652) LRANGE_600 (first 600 elements): rps=11965.0 (overall: 10627.6) avg_msec=16.608 (overall: 21.089) LRANGE_600 (first 600 elements): rps=11980.3 (overall: 10748.3) avg_msec=14.623 (overall: 20.445) LRANGE_600 (first 600 elements): rps=11669.3 (overall: 10823.0) avg_msec=15.997 (overall: 20.056) LRANGE_600 (first 600 elements): rps=10680.8 (overall: 10812.0) avg_msec=20.684 (overall: 20.104) LRANGE_600 (first 600 elements): rps=10758.9 (overall: 10808.3) avg_msec=22.404 (overall: 20.265) LRANGE_600 (first 600 elements): rps=11114.6 (overall: 10828.3) avg_msec=20.002 (overall: 20.247) LRANGE_600 (first 600 elements): rps=10036.0 (overall: 10780.2) avg_msec=23.266 (overall: 20.418) LRANGE_600 (first 600 elements): rps=8645.4 (overall: 10657.3) avg_msec=28.896 (overall: 20.814) LRANGE_600 (first 600 elements): rps=8609.6 (overall: 10545.9) avg_msec=28.865 (overall: 21.171) LRANGE_600 (first 600 elements): rps=8964.4 (overall: 10463.7) avg_msec=27.364 (overall: 21.447) LRANGE_600 (first 600 elements): rps=11034.9 (overall: 10492.5) avg_msec=18.835 (overall: 21.309) LRANGE_600 (first 600 elements): rps=10604.7 (overall: 10497.9) avg_msec=23.000 (overall: 21.391) LRANGE_600 (first 600 elements): rps=10716.5 (overall: 10507.7) avg_msec=21.459 (overall: 21.394) LRANGE_600 (first 600 elements): rps=10070.6 (overall: 10488.8) avg_msec=24.547 (overall: 21.525) LRANGE_600 (first 600 elements): rps=9783.5 (overall: 10459.6) avg_msec=23.644 (overall: 21.607) LRANGE_600 (first 600 elements): rps=11446.2 
(overall: 10498.4) avg_msec=19.978 (overall: 21.537) LRANGE_600 (first 600 elements): rps=11309.8 (overall: 10529.5) avg_msec=17.828 (overall: 21.384) LRANGE_600 (first 600 elements): rps=11418.0 (overall: 10562.4) avg_msec=19.321 (overall: 21.302) LRANGE_600 (first 600 elements): rps=10795.3 (overall: 10570.7) avg_msec=20.029 (overall: 21.255) LRANGE_600 (first 600 elements): rps=9920.3 (overall: 10548.6) avg_msec=23.906 (overall: 21.340) LRANGE_600 (first 600 elements): rps=9858.8 (overall: 10525.7) avg_msec=25.985 (overall: 21.485) LRANGE_600 (first 600 elements): rps=11498.0 (overall: 10556.7) avg_msec=16.045 (overall: 21.295) LRANGE_600 (first 600 elements): rps=11920.9 (overall: 10599.0) avg_msec=13.760 (overall: 21.033) LRANGE_600 (first 600 elements): rps=11502.0 (overall: 10626.3) avg_msec=15.618 (overall: 20.856) LRANGE_600 (first 600 elements): rps=9442.2 (overall: 10592.1) avg_msec=25.732 (overall: 20.981) LRANGE_600 (first 600 elements): rps=9595.3 (overall: 10563.4) avg_msec=26.289 (overall: 21.120) LRANGE_600 (first 600 elements): rps=11586.6 (overall: 10591.7) avg_msec=18.678 (overall: 21.046) LRANGE_600 (first 600 elements): rps=10063.7 (overall: 10577.6) avg_msec=24.508 (overall: 21.134) ====== LRANGE_600 (first 600 elements) ====== 229s 100000 requests completed in 9.45 seconds 229s 50 parallel clients 229s 3 bytes payload 229s keep alive: 1 229s host configuration "save": 3600 1 300 100 60 10000 229s host configuration "appendonly": no 229s multi-thread: no 229s 229s Latency by percentile distribution: 229s 0.000% <= 1.527 milliseconds (cumulative count 10) 229s 50.000% <= 19.727 milliseconds (cumulative count 50060) 229s 75.000% <= 28.655 milliseconds (cumulative count 75040) 229s 87.500% <= 33.919 milliseconds (cumulative count 87530) 229s 93.750% <= 36.287 milliseconds (cumulative count 93790) 229s 96.875% <= 38.111 milliseconds (cumulative count 96890) 229s 98.438% <= 39.551 milliseconds (cumulative count 98440) 229s 99.219% <= 40.863 
milliseconds (cumulative count 99250) 229s 99.609% <= 42.367 milliseconds (cumulative count 99610) 229s 99.805% <= 44.223 milliseconds (cumulative count 99810) 229s 99.902% <= 45.535 milliseconds (cumulative count 99910) 229s 99.951% <= 46.399 milliseconds (cumulative count 99960) 229s 99.976% <= 46.911 milliseconds (cumulative count 99980) 229s 99.988% <= 48.063 milliseconds (cumulative count 99990) 229s 99.994% <= 48.319 milliseconds (cumulative count 100000) 229s 100.000% <= 48.319 milliseconds (cumulative count 100000) 229s 229s Cumulative distribution of latencies: 229s 0.000% <= 0.103 milliseconds (cumulative count 0) 229s 0.010% <= 1.607 milliseconds (cumulative count 10) 229s 0.050% <= 1.703 milliseconds (cumulative count 50) 229s 0.090% <= 1.807 milliseconds (cumulative count 90) 229s 0.120% <= 1.903 milliseconds (cumulative count 120) 229s 0.210% <= 2.007 milliseconds (cumulative count 210) 229s 0.290% <= 2.103 milliseconds (cumulative count 290) 229s 1.310% <= 3.103 milliseconds (cumulative count 1310) 229s 1.860% <= 4.103 milliseconds (cumulative count 1860) 229s 2.520% <= 5.103 milliseconds (cumulative count 2520) 229s 3.660% <= 6.103 milliseconds (cumulative count 3660) 229s 4.820% <= 7.103 milliseconds (cumulative count 4820) 229s 6.200% <= 8.103 milliseconds (cumulative count 6200) 229s 7.920% <= 9.103 milliseconds (cumulative count 7920) 229s 10.280% <= 10.103 milliseconds (cumulative count 10280) 229s 13.610% <= 11.103 milliseconds (cumulative count 13610) 229s 17.510% <= 12.103 milliseconds (cumulative count 17510) 229s 22.150% <= 13.103 milliseconds (cumulative count 22150) 229s 26.800% <= 14.103 milliseconds (cumulative count 26800) 229s 31.650% <= 15.103 milliseconds (cumulative count 31650) 229s 35.880% <= 16.103 milliseconds (cumulative count 35880) 229s 39.650% <= 17.103 milliseconds (cumulative count 39650) 229s 43.730% <= 18.111 milliseconds (cumulative count 43730) 229s 47.680% <= 19.103 milliseconds (cumulative count 47680) 229s 51.440% 
<= 20.111 milliseconds (cumulative count 51440) 229s 54.530% <= 21.103 milliseconds (cumulative count 54530) 229s 57.710% <= 22.111 milliseconds (cumulative count 57710) 229s 60.330% <= 23.103 milliseconds (cumulative count 60330) 229s 63.530% <= 24.111 milliseconds (cumulative count 63530) 229s 66.570% <= 25.103 milliseconds (cumulative count 66570) 229s 69.220% <= 26.111 milliseconds (cumulative count 69220) 229s 71.440% <= 27.103 milliseconds (cumulative count 71440) 229s 73.770% <= 28.111 milliseconds (cumulative count 73770) 229s 76.140% <= 29.103 milliseconds (cumulative count 76140) 229s 78.430% <= 30.111 milliseconds (cumulative count 78430) 229s 80.480% <= 31.103 milliseconds (cumulative count 80480) 229s 82.530% <= 32.111 milliseconds (cumulative count 82530) 229s 85.200% <= 33.119 milliseconds (cumulative count 85200) 229s 88.040% <= 34.111 milliseconds (cumulative count 88040) 229s 90.780% <= 35.103 milliseconds (cumulative count 90780) 229s 93.420% <= 36.127 milliseconds (cumulative count 93420) 229s 95.400% <= 37.119 milliseconds (cumulative count 95400) 229s 96.890% <= 38.111 milliseconds (cumulative count 96890) 229s 98.120% <= 39.103 milliseconds (cumulative count 98120) 229s 98.840% <= 40.127 milliseconds (cumulative count 98840) 229s 99.340% <= 41.119 milliseconds (cumulative count 99340) 229s 99.590% <= 42.111 milliseconds (cumulative count 99590) 229s 99.650% <= 43.103 milliseconds (cumulative count 99650) 229s 99.790% <= 44.127 milliseconds (cumulative count 99790) 229s 99.890% <= 45.119 milliseconds (cumulative count 99890) 229s 99.940% <= 46.111 milliseconds (cumulative count 99940) 229s 99.980% <= 47.103 milliseconds (cumulative count 99980) 229s 99.990% <= 48.127 milliseconds (cumulative count 99990) 229s 100.000% <= 49.119 milliseconds (cumulative count 100000) 229s 229s Summary: 229s throughput summary: 10578.65 requests per second 229s latency summary (msec): 229s avg min p50 p95 p99 max 229s 21.122 1.520 19.727 36.895 40.415 48.319 
230s MSET (10 keys): rps=116600.0 (overall: 123516.9) avg_msec=3.800 (overall: 3.800) MSET (10 keys): rps=119920.3 (overall: 121663.2) avg_msec=3.937 (overall: 3.870) MSET (10 keys): rps=121753.0 (overall: 121693.8) avg_msec=3.865 (overall: 3.868) ====== MSET (10 keys) ====== 230s 100000 requests completed in 0.82 seconds 230s 50 parallel clients 230s 3 bytes payload 230s keep alive: 1 230s host configuration "save": 3600 1 300 100 60 10000 230s host configuration "appendonly": no 230s multi-thread: no 230s 230s Latency by percentile distribution: 230s 0.000% <= 0.799 milliseconds (cumulative count 10) 230s 50.000% <= 3.863 milliseconds (cumulative count 50110) 230s 75.000% <= 4.111 milliseconds (cumulative count 75630) 230s 87.500% <= 4.263 milliseconds (cumulative count 87840) 230s 93.750% <= 4.375 milliseconds (cumulative count 93880) 230s 96.875% <= 4.471 milliseconds (cumulative count 96980) 230s 98.438% <= 4.551 milliseconds (cumulative count 98470) 230s 99.219% <= 4.631 milliseconds (cumulative count 99280) 230s 99.609% <= 4.711 milliseconds (cumulative count 99620) 230s 99.805% <= 4.807 milliseconds (cumulative count 99830) 230s 99.902% <= 4.887 milliseconds (cumulative count 99910) 230s 99.951% <= 4.999 milliseconds (cumulative count 99960) 230s 99.976% <= 5.015 milliseconds (cumulative count 99980) 230s 99.988% <= 5.079 milliseconds (cumulative count 99990) 230s 99.994% <= 5.191 milliseconds (cumulative count 100000) 230s 100.000% <= 5.191 milliseconds (cumulative count 100000) 230s 230s Cumulative distribution of latencies: 230s 0.000% <= 0.103 milliseconds (cumulative count 0) 230s 0.010% <= 0.807 milliseconds (cumulative count 10) 230s 0.020% <= 1.903 milliseconds (cumulative count 20) 230s 0.040% <= 2.007 milliseconds (cumulative count 40) 230s 0.100% <= 2.103 milliseconds (cumulative count 100) 230s 2.030% <= 3.103 milliseconds (cumulative count 2030) 230s 74.810% <= 4.103 milliseconds (cumulative count 74810) 230s 99.990% <= 5.103 milliseconds 
(cumulative count 99990) 230s 100.000% <= 6.103 milliseconds (cumulative count 100000) 230s 230s Summary: 230s throughput summary: 121951.22 requests per second 230s latency summary (msec): 230s avg min p50 p95 p99 max 230s 3.866 0.792 3.863 4.407 4.599 5.191 231s XADD: rps=146772.9 (overall: 221927.7) avg_msec=2.028 (overall: 2.028) XADD: rps=227689.2 (overall: 225395.7) avg_msec=1.991 (overall: 2.006) ====== XADD ====== 231s 100000 requests completed in 0.44 seconds 231s 50 parallel clients 231s 3 bytes payload 231s keep alive: 1 231s host configuration "save": 3600 1 300 100 60 10000 231s host configuration "appendonly": no 231s multi-thread: no 231s 231s Latency by percentile distribution: 231s 0.000% <= 0.551 milliseconds (cumulative count 10) 231s 50.000% <= 2.007 milliseconds (cumulative count 50670) 231s 75.000% <= 2.239 milliseconds (cumulative count 75550) 231s 87.500% <= 2.383 milliseconds (cumulative count 87880) 231s 93.750% <= 2.479 milliseconds (cumulative count 93830) 231s 96.875% <= 2.567 milliseconds (cumulative count 97040) 231s 98.438% <= 2.631 milliseconds (cumulative count 98440) 231s 99.219% <= 2.687 milliseconds (cumulative count 99220) 231s 99.609% <= 2.743 milliseconds (cumulative count 99660) 231s 99.805% <= 2.783 milliseconds (cumulative count 99820) 231s 99.902% <= 2.815 milliseconds (cumulative count 99910) 231s 99.951% <= 2.887 milliseconds (cumulative count 99970) 231s 99.976% <= 2.903 milliseconds (cumulative count 99980) 231s 99.988% <= 2.935 milliseconds (cumulative count 100000) 231s 100.000% <= 2.935 milliseconds (cumulative count 100000) 231s 231s Cumulative distribution of latencies: 231s 0.000% <= 0.103 milliseconds (cumulative count 0) 231s 0.010% <= 0.607 milliseconds (cumulative count 10) 231s 0.020% <= 0.903 milliseconds (cumulative count 20) 231s 0.090% <= 1.007 milliseconds (cumulative count 90) 231s 0.230% <= 1.103 milliseconds (cumulative count 230) 231s 0.620% <= 1.207 milliseconds (cumulative count 620) 231s 1.740% 
<= 1.303 milliseconds (cumulative count 1740) 231s 3.420% <= 1.407 milliseconds (cumulative count 3420) 231s 5.570% <= 1.503 milliseconds (cumulative count 5570) 231s 9.870% <= 1.607 milliseconds (cumulative count 9870) 231s 16.700% <= 1.703 milliseconds (cumulative count 16700) 231s 27.460% <= 1.807 milliseconds (cumulative count 27460) 231s 38.260% <= 1.903 milliseconds (cumulative count 38260) 231s 50.670% <= 2.007 milliseconds (cumulative count 50670) 231s 61.760% <= 2.103 milliseconds (cumulative count 61760) 231s 100.000% <= 3.103 milliseconds (cumulative count 100000) 231s 231s Summary: 231s throughput summary: 226244.34 requests per second 231s latency summary (msec): 231s avg min p50 p95 p99 max 231s 2.004 0.544 2.007 2.511 2.671 2.935 236s FUNCTION LOAD: rps=16733.1 (overall: 18834.1) avg_msec=25.091 (overall: 25.091) FUNCTION LOAD: rps=19240.0 (overall: 19048.6) avg_msec=25.835 (overall: 25.488) FUNCTION LOAD: rps=17928.3 (overall: 18660.2) avg_msec=26.631 (overall: 25.869) FUNCTION LOAD: rps=18690.5 (overall: 18668.0) avg_msec=26.170 (overall: 25.947) FUNCTION LOAD: rps=19203.2 (overall: 18777.5) avg_msec=26.493 (overall: 26.061) FUNCTION LOAD: rps=18764.9 (overall: 18775.4) avg_msec=26.343 (overall: 26.109) FUNCTION LOAD: rps=18360.0 (overall: 18715.3) avg_msec=26.284 (overall: 26.134) FUNCTION LOAD: rps=19043.8 (overall: 18756.9) avg_msec=26.252 (overall: 26.149) FUNCTION LOAD: rps=18640.0 (overall: 18743.8) avg_msec=26.009 (overall: 26.133) FUNCTION LOAD: rps=19127.0 (overall: 18782.8) avg_msec=26.091 (overall: 26.129) FUNCTION LOAD: rps=19681.3 (overall: 18865.3) avg_msec=25.886 (overall: 26.106) FUNCTION LOAD: rps=19203.2 (overall: 18893.7) avg_msec=25.844 (overall: 26.083) FUNCTION LOAD: rps=18964.1 (overall: 18899.2) avg_msec=26.027 (overall: 26.079) FUNCTION LOAD: rps=17928.3 (overall: 18829.3) avg_msec=26.419 (overall: 26.102) FUNCTION LOAD: rps=19881.0 (overall: 18900.2) avg_msec=25.799 (overall: 26.081) FUNCTION LOAD: rps=18924.3 (overall: 
18901.7) avg_msec=25.869 (overall: 26.067) FUNCTION LOAD: rps=19043.8 (overall: 18910.1) avg_msec=26.151 (overall: 26.072) FUNCTION LOAD: rps=19004.0 (overall: 18915.4) avg_msec=26.120 (overall: 26.075) FUNCTION LOAD: rps=18964.1 (overall: 18917.9) avg_msec=26.390 (overall: 26.092) FUNCTION LOAD: rps=18008.0 (overall: 18872.2) avg_msec=26.359 (overall: 26.104) FUNCTION LOAD: rps=19282.9 (overall: 18891.9) avg_msec=26.725 (overall: 26.135) ====== FUNCTION LOAD ====== 236s 100000 requests completed in 5.29 seconds 236s 50 parallel clients 236s 3 bytes payload 236s keep alive: 1 236s host configuration "save": 3600 1 300 100 60 10000 236s host configuration "appendonly": no 236s multi-thread: no 236s 236s Latency by percentile distribution: 236s 0.000% <= 0.967 milliseconds (cumulative count 10) 236s 50.000% <= 26.159 milliseconds (cumulative count 50720) 236s 75.000% <= 26.575 milliseconds (cumulative count 75220) 236s 87.500% <= 26.943 milliseconds (cumulative count 87780) 236s 93.750% <= 27.279 milliseconds (cumulative count 93820) 236s 96.875% <= 27.695 milliseconds (cumulative count 96920) 236s 98.438% <= 28.127 milliseconds (cumulative count 98470) 236s 99.219% <= 28.495 milliseconds (cumulative count 99250) 236s 99.609% <= 28.911 milliseconds (cumulative count 99610) 236s 99.805% <= 30.799 milliseconds (cumulative count 99810) 236s 99.902% <= 31.231 milliseconds (cumulative count 99910) 236s 99.951% <= 31.567 milliseconds (cumulative count 99960) 236s 99.976% <= 31.647 milliseconds (cumulative count 99980) 236s 99.988% <= 31.679 milliseconds (cumulative count 99990) 236s 99.994% <= 31.791 milliseconds (cumulative count 100000) 236s 100.000% <= 31.791 milliseconds (cumulative count 100000) 236s 236s Cumulative distribution of latencies: 236s 0.000% <= 0.103 milliseconds (cumulative count 0) 236s 0.010% <= 1.007 milliseconds (cumulative count 10) 236s 0.050% <= 4.103 milliseconds (cumulative count 50) 236s 0.100% <= 6.103 milliseconds (cumulative count 100) 236s 
0.130% <= 7.103 milliseconds (cumulative count 130) 236s 0.170% <= 9.103 milliseconds (cumulative count 170) 236s 0.190% <= 10.103 milliseconds (cumulative count 190) 236s 0.250% <= 11.103 milliseconds (cumulative count 250) 236s 0.310% <= 12.103 milliseconds (cumulative count 310) 236s 0.360% <= 13.103 milliseconds (cumulative count 360) 236s 0.410% <= 14.103 milliseconds (cumulative count 410) 236s 0.620% <= 15.103 milliseconds (cumulative count 620) 236s 0.660% <= 16.103 milliseconds (cumulative count 660) 236s 0.670% <= 17.103 milliseconds (cumulative count 670) 236s 0.700% <= 18.111 milliseconds (cumulative count 700) 236s 0.790% <= 20.111 milliseconds (cumulative count 790) 236s 1.050% <= 23.103 milliseconds (cumulative count 1050) 236s 1.900% <= 25.103 milliseconds (cumulative count 1900) 236s 47.540% <= 26.111 milliseconds (cumulative count 47540) 236s 91.090% <= 27.103 milliseconds (cumulative count 91090) 236s 98.420% <= 28.111 milliseconds (cumulative count 98420) 236s 99.700% <= 29.103 milliseconds (cumulative count 99700) 236s 99.730% <= 30.111 milliseconds (cumulative count 99730) 236s 99.880% <= 31.103 milliseconds (cumulative count 99880) 236s 100.000% <= 32.111 milliseconds (cumulative count 100000) 236s 236s Summary: 236s throughput summary: 18896.45 requests per second 236s latency summary (msec): 236s avg min p50 p95 p99 max 236s 26.122 0.960 26.159 27.391 28.399 31.791 236s FCALL: rps=191360.0 (overall: 241616.2) avg_msec=1.862 (overall: 1.862) ====== FCALL ====== 236s 100000 requests completed in 0.41 seconds 236s 50 parallel clients 236s 3 bytes payload 236s keep alive: 1 236s host configuration "save": 3600 1 300 100 60 10000 236s host configuration "appendonly": no 236s multi-thread: no 236s 236s Latency by percentile distribution: 236s 0.000% <= 0.519 milliseconds (cumulative count 10) 236s 50.000% <= 1.839 milliseconds (cumulative count 50820) 236s 75.000% <= 2.055 milliseconds (cumulative count 75760) 236s 87.500% <= 2.191 milliseconds 
(cumulative count 87890) 236s 93.750% <= 2.295 milliseconds (cumulative count 93950) 236s 96.875% <= 2.391 milliseconds (cumulative count 96920) 236s 98.438% <= 2.567 milliseconds (cumulative count 98470) 236s 99.219% <= 3.287 milliseconds (cumulative count 99220) 236s 99.609% <= 5.887 milliseconds (cumulative count 99610) 236s 99.805% <= 6.375 milliseconds (cumulative count 99810) 236s 99.902% <= 6.551 milliseconds (cumulative count 99910) 236s 99.951% <= 6.839 milliseconds (cumulative count 99960) 236s 99.976% <= 7.007 milliseconds (cumulative count 99980) 236s 99.988% <= 7.039 milliseconds (cumulative count 99990) 236s 99.994% <= 7.047 milliseconds (cumulative count 100000) 236s 100.000% <= 7.047 milliseconds (cumulative count 100000) 236s 236s Cumulative distribution of latencies: 236s 0.000% <= 0.103 milliseconds (cumulative count 0) 236s 0.100% <= 0.607 milliseconds (cumulative count 100) 236s 0.210% <= 0.703 milliseconds (cumulative count 210) 236s 0.340% <= 0.807 milliseconds (cumulative count 340) 236s 0.500% <= 0.903 milliseconds (cumulative count 500) 236s 0.820% <= 1.007 milliseconds (cumulative count 820) 236s 1.280% <= 1.103 milliseconds (cumulative count 1280) 236s 2.400% <= 1.207 milliseconds (cumulative count 2400) 236s 4.330% <= 1.303 milliseconds (cumulative count 4330) 236s 8.140% <= 1.407 milliseconds (cumulative count 8140) 236s 14.490% <= 1.503 milliseconds (cumulative count 14490) 236s 24.670% <= 1.607 milliseconds (cumulative count 24670) 236s 34.880% <= 1.703 milliseconds (cumulative count 34880) 236s 47.110% <= 1.807 milliseconds (cumulative count 47110) 236s 58.290% <= 1.903 milliseconds (cumulative count 58290) 236s 70.630% <= 2.007 milliseconds (cumulative count 70630) 236s 80.310% <= 2.103 milliseconds (cumulative count 80310) 236s 99.090% <= 3.103 milliseconds (cumulative count 99090) 236s 99.500% <= 4.103 milliseconds (cumulative count 99500) 236s 99.690% <= 6.103 milliseconds (cumulative count 99690) 236s 100.000% <= 7.103 
milliseconds (cumulative count 100000) 236s 236s Summary: 236s throughput summary: 243902.44 requests per second 236s latency summary (msec): 236s avg min p50 p95 p99 max 236s 1.852 0.512 1.839 2.327 3.023 7.047 236s 237s autopkgtest [08:24:49]: test 0002-benchmark: -----------------------] 240s autopkgtest [08:24:52]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 240s 0002-benchmark PASS 244s autopkgtest [08:24:56]: test 0003-valkey-check-aof: preparing testbed 246s Reading package lists... 246s Building dependency tree... 246s Reading state information... 247s Solving dependencies... 248s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 255s autopkgtest [08:25:07]: test 0003-valkey-check-aof: [----------------------- 257s autopkgtest [08:25:09]: test 0003-valkey-check-aof: -----------------------] 261s 0003-valkey-check-aof PASS 261s autopkgtest [08:25:13]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - - 265s autopkgtest [08:25:17]: test 0004-valkey-check-rdb: preparing testbed 267s Reading package lists... 267s Building dependency tree... 267s Reading state information... 267s Solving dependencies... 268s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 275s autopkgtest [08:25:27]: test 0004-valkey-check-rdb: [----------------------- 282s OK 282s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb 282s [offset 27] AUX FIELD valkey-ver = '8.1.1' 282s [offset 41] AUX FIELD redis-bits = '32' 282s [offset 53] AUX FIELD ctime = '1751271934' 282s [offset 68] AUX FIELD used-mem = '2799992' 282s [offset 80] AUX FIELD aof-base = '0' 282s [offset 191] Selecting DB ID 0 282s [offset 566852] Checksum OK 282s [offset 566852] \o/ RDB looks OK! 
\o/ 282s [info] 5 keys read 282s [info] 0 expires 282s [info] 0 already expired 283s autopkgtest [08:25:35]: test 0004-valkey-check-rdb: -----------------------] 286s autopkgtest [08:25:38]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - - 286s 0004-valkey-check-rdb PASS 290s autopkgtest [08:25:42]: test 0005-cjson: preparing testbed 292s Reading package lists... 292s Building dependency tree... 292s Reading state information... 292s Solving dependencies... 293s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 300s autopkgtest [08:25:52]: test 0005-cjson: [----------------------- 307s 308s autopkgtest [08:26:00]: test 0005-cjson: -----------------------] 312s 0005-cjson PASS 312s autopkgtest [08:26:04]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - - 315s autopkgtest [08:26:07]: test 0006-migrate-from-redis: preparing testbed 338s autopkgtest [08:26:30]: testbed dpkg architecture: armhf 339s autopkgtest [08:26:31]: testbed apt version: 3.1.2 343s autopkgtest [08:26:35]: @@@@@@@@@@@@@@@@@@@@ test bed setup 345s autopkgtest [08:26:37]: testbed release detected to be: questing 352s autopkgtest [08:26:44]: updating testbed package index (apt update) 354s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB] 355s Get:2 http://ftpmaster.internal/ubuntu questing InRelease [249 kB] 355s Get:3 http://ftpmaster.internal/ubuntu questing-updates InRelease [110 kB] 355s Get:4 http://ftpmaster.internal/ubuntu questing-security InRelease [110 kB] 355s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [17.5 kB] 355s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [26.6 kB] 355s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [429 kB] 355s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/main armhf Packages [32.9 kB] 355s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/universe armhf Packages [359 kB] 355s Get:10 
http://ftpmaster.internal/ubuntu questing-proposed/multiverse armhf Packages [3452 B] 355s Get:11 http://ftpmaster.internal/ubuntu questing/universe Sources [21.3 MB] 356s Get:12 http://ftpmaster.internal/ubuntu questing/universe armhf Packages [15.3 MB] 360s Fetched 38.2 MB in 6s (6833 kB/s) 361s Reading package lists... 366s autopkgtest [08:26:58]: upgrading testbed (apt dist-upgrade and autopurge) 368s Reading package lists... 369s Building dependency tree... 369s Reading state information... 369s Calculating upgrade... 370s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 372s Reading package lists... 372s Building dependency tree... 372s Reading state information... 372s Solving dependencies... 373s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 375s autopkgtest [08:27:07]: rebooting testbed after setup commands that affected boot 437s Reading package lists... 437s Building dependency tree... 437s Reading state information... 437s Solving dependencies... 438s The following NEW packages will be installed: 438s liblzf1 redis-sentinel redis-server redis-tools 439s 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded. 439s Need to get 1308 kB of archives. 439s After this operation, 5361 kB of additional disk space will be used. 439s Get:1 http://ftpmaster.internal/ubuntu questing/universe armhf liblzf1 armhf 3.6-4 [6554 B] 439s Get:2 http://ftpmaster.internal/ubuntu questing-proposed/universe armhf redis-tools armhf 5:8.0.0-2 [1236 kB] 439s Get:3 http://ftpmaster.internal/ubuntu questing-proposed/universe armhf redis-sentinel armhf 5:8.0.0-2 [12.5 kB] 439s Get:4 http://ftpmaster.internal/ubuntu questing-proposed/universe armhf redis-server armhf 5:8.0.0-2 [53.2 kB] 440s Fetched 1308 kB in 1s (1449 kB/s) 440s Selecting previously unselected package liblzf1:armhf. 440s (Reading database ... 59851 files and directories currently installed.) 440s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ... 440s Unpacking liblzf1:armhf (3.6-4) ... 440s Selecting previously unselected package redis-tools. 440s Preparing to unpack .../redis-tools_5%3a8.0.0-2_armhf.deb ... 440s Unpacking redis-tools (5:8.0.0-2) ... 441s Selecting previously unselected package redis-sentinel. 441s Preparing to unpack .../redis-sentinel_5%3a8.0.0-2_armhf.deb ... 441s Unpacking redis-sentinel (5:8.0.0-2) ... 441s Selecting previously unselected package redis-server. 441s Preparing to unpack .../redis-server_5%3a8.0.0-2_armhf.deb ... 441s Unpacking redis-server (5:8.0.0-2) ... 441s Setting up liblzf1:armhf (3.6-4) ... 441s Setting up redis-tools (5:8.0.0-2) ... 441s Setting up redis-server (5:8.0.0-2) ... 442s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'. 442s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'. 442s Setting up redis-sentinel (5:8.0.0-2) ... 443s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'. 443s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'. 443s Processing triggers for man-db (2.13.1-1) ... 444s Processing triggers for libc-bin (2.41-6ubuntu2) ...
456s autopkgtest [08:28:28]: test 0006-migrate-from-redis: [----------------------- 458s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION 458s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf 458s + systemctl restart redis-server 458s + redis-cli -h 127.0.0.1 -p 6379 SET test 1 458s OK 458s + redis-cli -h 127.0.0.1 -p 6379 GET test 458s 1 458s + redis-cli -h 127.0.0.1 -p 6379 SAVE 458s OK 458s + sha256sum /var/lib/redis/dump.rdb 458s 813e1b90dcc579a7caf62e334991b39dc5cccf2e140129b3b1eb54619a90a220 /var/lib/redis/dump.rdb 458s + apt-get install -y valkey-redis-compat 458s Reading package lists... 458s Building dependency tree... 458s Reading state information... 458s Solving dependencies... 459s The following additional packages will be installed: 459s valkey-server valkey-tools 459s Suggested packages: 459s ruby-redis 459s The following packages will be REMOVED: 459s redis-sentinel redis-server redis-tools 459s The following NEW packages will be installed: 459s valkey-redis-compat valkey-server valkey-tools 460s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded. 460s Need to get 1257 kB of archives. 460s After this operation, 220 kB disk space will be freed. 460s Get:1 http://ftpmaster.internal/ubuntu questing/universe armhf valkey-tools armhf 8.1.1+dfsg1-2ubuntu1 [1198 kB] 460s Get:2 http://ftpmaster.internal/ubuntu questing/universe armhf valkey-server armhf 8.1.1+dfsg1-2ubuntu1 [51.7 kB] 460s Get:3 http://ftpmaster.internal/ubuntu questing/universe armhf valkey-redis-compat all 8.1.1+dfsg1-2ubuntu1 [7794 B] 461s Fetched 1257 kB in 1s (1477 kB/s) 461s (Reading database ... 59900 files and directories currently installed.) 461s Removing redis-sentinel (5:8.0.0-2) ... 462s Removing redis-server (5:8.0.0-2) ... 462s Removing redis-tools (5:8.0.0-2) ... 462s Selecting previously unselected package valkey-tools. 462s (Reading database ... 59865 files and directories currently installed.) 462s Preparing to unpack .../valkey-tools_8.1.1+dfsg1-2ubuntu1_armhf.deb ... 462s Unpacking valkey-tools (8.1.1+dfsg1-2ubuntu1) ... 462s Selecting previously unselected package valkey-server. 462s Preparing to unpack .../valkey-server_8.1.1+dfsg1-2ubuntu1_armhf.deb ... 462s Unpacking valkey-server (8.1.1+dfsg1-2ubuntu1) ... 462s Selecting previously unselected package valkey-redis-compat. 462s Preparing to unpack .../valkey-redis-compat_8.1.1+dfsg1-2ubuntu1_all.deb ... 462s Unpacking valkey-redis-compat (8.1.1+dfsg1-2ubuntu1) ... 462s Setting up valkey-tools (8.1.1+dfsg1-2ubuntu1) ... 463s Setting up valkey-server (8.1.1+dfsg1-2ubuntu1) ... 463s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'. 463s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
464s Setting up valkey-redis-compat (8.1.1+dfsg1-2ubuntu1) ... 464s dpkg-query: no packages found matching valkey-sentinel 464s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please, review the content of valkey.conf, especially if you had modified redis.conf. 464s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please, review the content of sentinel.conf, especially if you had modified sentinel.conf. 464s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey. 464s Processing triggers for man-db (2.13.1-1) ... 464s + '[' -f /etc/valkey/REDIS_MIGRATION ']' 464s + sha256sum /var/lib/valkey/dump.rdb 464s 260357d37528be7d711220287d609b52f92ec2b7962da22b65bfc7ee81cf0d41 /var/lib/valkey/dump.rdb 464s + systemctl status valkey-server 464s + grep inactive 464s Active: inactive (dead) since Mon 2025-06-30 08:28:36 UTC; 784ms ago 464s + rm /etc/valkey/REDIS_MIGRATION 464s + systemctl start valkey-server 465s Job for valkey-server.service failed because the control process exited with error code. 465s See "systemctl status valkey-server.service" and "journalctl -xeu valkey-server.service" for details. 465s autopkgtest [08:28:37]: test 0006-migrate-from-redis: -----------------------] 469s autopkgtest [08:28:41]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - - 469s 0006-migrate-from-redis FAIL non-zero exit status 1 472s autopkgtest [08:28:44]: @@@@@@@@@@@@@@@@@@@@ summary 472s 0001-valkey-cli PASS 472s 0002-benchmark PASS 472s 0003-valkey-check-aof PASS 472s 0004-valkey-check-rdb PASS 472s 0005-cjson PASS 472s 0006-migrate-from-redis FAIL non-zero exit status 1