0s autopkgtest [16:52:29]: starting date and time: 2025-03-15 16:52:29+0000
0s autopkgtest [16:52:29]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [16:52:29]: host juju-7f2275-prod-proposed-migration-environment-9; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.bte7pkl6/out --timeout-copy=6000 --setup-commands 'ln -s /dev/null /etc/systemd/system/bluetooth.service; printf "http_proxy=http://squid.internal:3128\nhttps_proxy=http://squid.internal:3128\nno_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com\n" >> /etc/environment' --apt-pocket=proposed=src:glibc --apt-upgrade redis --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=glibc/2.41-1ubuntu2 -- lxd -r lxd-armhf-10.145.243.240 lxd-armhf-10.145.243.240:autopkgtest/ubuntu/plucky/armhf
21s autopkgtest [16:52:50]: testbed dpkg architecture: armhf
22s autopkgtest [16:52:51]: testbed apt version: 2.9.33
26s autopkgtest [16:52:55]: @@@@@@@@@@@@@@@@@@@@ test bed setup
28s autopkgtest [16:52:57]: testbed release detected to be: None
35s autopkgtest [16:53:04]: updating testbed package index (apt update)
37s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
38s Get:2 http://ftpmaster.internal/ubuntu plucky InRelease [257 kB]
39s Get:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease [126 kB]
39s Get:4 http://ftpmaster.internal/ubuntu plucky-security InRelease [126 kB]
39s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
39s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [379 kB]
39s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [99.7 kB]
39s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf Packages [114 kB]
39s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf c-n-f Metadata [1832 B]
39s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted armhf c-n-f Metadata [116 B]
39s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf Packages [312 kB]
39s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf c-n-f Metadata [11.1 kB]
39s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf Packages [3472 B]
39s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf c-n-f Metadata [240 B]
39s Get:15 http://ftpmaster.internal/ubuntu plucky/universe Sources [21.0 MB]
62s Get:16 http://ftpmaster.internal/ubuntu plucky/multiverse Sources [299 kB]
62s Get:17 http://ftpmaster.internal/ubuntu plucky/main Sources [1394 kB]
64s Get:18 http://ftpmaster.internal/ubuntu plucky/main armhf Packages [1378 kB]
66s Get:19 http://ftpmaster.internal/ubuntu plucky/main armhf c-n-f Metadata [29.4 kB]
66s Get:20 http://ftpmaster.internal/ubuntu plucky/restricted armhf c-n-f Metadata [108 B]
66s Get:21 http://ftpmaster.internal/ubuntu plucky/universe armhf Packages [15.1 MB]
81s Get:22 http://ftpmaster.internal/ubuntu plucky/multiverse armhf Packages [172 kB]
83s Fetched 41.0 MB in 46s (897 kB/s)
84s Reading package lists...
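The runner command recorded at the top of this log can be approximated outside the production infrastructure; a minimal sketch, assuming a working autopkgtest installation and access to the same lxd remote, reusing only options that appear in the command line above (the proxy setup commands and the exact output directory are omitted here):

    # hypothetical local re-run of this test request
    autopkgtest --output-dir ./autopkgtest-out \
        --apt-pocket=proposed=src:glibc --apt-upgrade \
        --env=ADT_TEST_TRIGGERS=glibc/2.41-1ubuntu2 \
        redis -- lxd -r lxd-armhf-10.145.243.240 \
        lxd-armhf-10.145.243.240:autopkgtest/ubuntu/plucky/armhf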
90s autopkgtest [16:53:59]: upgrading testbed (apt dist-upgrade and autopurge)
92s Reading package lists...
92s Building dependency tree...
92s Reading state information...
93s Calculating upgrade...Starting pkgProblemResolver with broken count: 0
93s Starting 2 pkgProblemResolver with broken count: 0
95s Done
95s Entering ResolveByKeep
95s
95s Calculating upgrade...
95s The following packages will be upgraded:
95s   libc-bin libc6 locales pinentry-curses python3-jinja2 sos strace
95s 7 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
95s Need to get 8683 kB of archives.
95s After this operation, 23.6 kB of additional disk space will be used.
95s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc6 armhf 2.41-1ubuntu2 [2932 kB]
98s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc-bin armhf 2.41-1ubuntu2 [545 kB]
99s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf locales all 2.41-1ubuntu2 [4246 kB]
103s Get:4 http://ftpmaster.internal/ubuntu plucky/main armhf strace armhf 6.13+ds-1ubuntu1 [445 kB]
104s Get:5 http://ftpmaster.internal/ubuntu plucky/main armhf pinentry-curses armhf 1.3.1-2ubuntu3 [40.6 kB]
104s Get:6 http://ftpmaster.internal/ubuntu plucky/main armhf python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
104s Get:7 http://ftpmaster.internal/ubuntu plucky/main armhf sos all 4.9.0-5 [365 kB]
105s Preconfiguring packages ...
105s Fetched 8683 kB in 10s (875 kB/s)
105s (Reading database ... 64655 files and directories currently installed.)
105s Preparing to unpack .../libc6_2.41-1ubuntu2_armhf.deb ...
105s Unpacking libc6:armhf (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
105s Setting up libc6:armhf (2.41-1ubuntu2) ...
106s (Reading database ... 64655 files and directories currently installed.)
106s Preparing to unpack .../libc-bin_2.41-1ubuntu2_armhf.deb ...
106s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
106s Setting up libc-bin (2.41-1ubuntu2) ...
106s (Reading database ... 64655 files and directories currently installed.)
106s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
106s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
106s Preparing to unpack .../strace_6.13+ds-1ubuntu1_armhf.deb ...
106s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
106s Preparing to unpack .../pinentry-curses_1.3.1-2ubuntu3_armhf.deb ...
106s Unpacking pinentry-curses (1.3.1-2ubuntu3) over (1.3.1-2ubuntu2) ...
106s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
106s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
106s Preparing to unpack .../archives/sos_4.9.0-5_all.deb ...
107s Unpacking sos (4.9.0-5) over (4.9.0-4) ...
107s Setting up sos (4.9.0-5) ...
107s Setting up pinentry-curses (1.3.1-2ubuntu3) ...
107s Setting up locales (2.41-1ubuntu2) ...
108s Generating locales (this might take a while)...
110s en_US.UTF-8... done
110s Generation complete.
110s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
110s Setting up strace (6.13+ds-1ubuntu1) ...
110s Processing triggers for man-db (2.13.0-1) ...
111s Processing triggers for systemd (257.3-1ubuntu3) ...
114s Reading package lists...
114s Building dependency tree...
114s Reading state information...
115s Starting pkgProblemResolver with broken count: 0
115s Starting 2 pkgProblemResolver with broken count: 0
115s Done
115s Solving dependencies...
115s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
118s autopkgtest [16:54:27]: rebooting testbed after setup commands that affected boot
158s autopkgtest [16:55:07]: testbed running kernel: Linux 6.8.0-52-generic #53~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Wed Jan 15 18:10:51 UTC 2
182s autopkgtest [16:55:31]: @@@@@@@@@@@@@@@@@@@@ apt-source redis
212s Get:1 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (dsc) [2273 B]
212s Get:2 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (tar) [3026 kB]
212s Get:3 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (diff) [31.7 kB]
212s gpgv: Signature made Tue Jan 21 10:13:21 2025 UTC
212s gpgv: using RSA key C2FE4BD271C139B86C533E461E953E27D4311E58
212s gpgv: Can't check signature: No public key
212s dpkg-source: warning: cannot verify inline signature for ./redis_7.0.15-3.dsc: no acceptable signature found
213s autopkgtest [16:56:02]: testing package redis version 5:7.0.15-3
216s autopkgtest [16:56:05]: build not needed
221s autopkgtest [16:56:10]: test 0001-redis-cli: preparing testbed
222s Reading package lists...
223s Building dependency tree...
223s Reading state information...
223s Starting pkgProblemResolver with broken count: 0
223s Starting 2 pkgProblemResolver with broken count: 0
223s Done
224s The following NEW packages will be installed:
224s   liblzf1 redis redis-sentinel redis-server redis-tools
224s 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
224s Need to get 1011 kB of archives.
224s After this operation, 4316 kB of additional disk space will be used.
224s Get:1 http://ftpmaster.internal/ubuntu plucky/universe armhf liblzf1 armhf 3.6-4 [6554 B]
224s Get:2 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-tools armhf 5:7.0.15-3 [937 kB]
225s Get:3 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-sentinel armhf 5:7.0.15-3 [12.2 kB]
225s Get:4 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-server armhf 5:7.0.15-3 [51.7 kB]
226s Get:5 http://ftpmaster.internal/ubuntu plucky/universe armhf redis all 5:7.0.15-3 [2914 B]
226s Fetched 1011 kB in 2s (636 kB/s)
226s Selecting previously unselected package liblzf1:armhf.
226s (Reading database ... 64655 files and directories currently installed.)
226s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ...
226s Unpacking liblzf1:armhf (3.6-4) ...
226s Selecting previously unselected package redis-tools.
226s Preparing to unpack .../redis-tools_5%3a7.0.15-3_armhf.deb ...
226s Unpacking redis-tools (5:7.0.15-3) ...
226s Selecting previously unselected package redis-sentinel.
226s Preparing to unpack .../redis-sentinel_5%3a7.0.15-3_armhf.deb ...
226s Unpacking redis-sentinel (5:7.0.15-3) ...
226s Selecting previously unselected package redis-server.
226s Preparing to unpack .../redis-server_5%3a7.0.15-3_armhf.deb ...
226s Unpacking redis-server (5:7.0.15-3) ...
226s Selecting previously unselected package redis.
226s Preparing to unpack .../redis_5%3a7.0.15-3_all.deb ...
226s Unpacking redis (5:7.0.15-3) ...
226s Setting up liblzf1:armhf (3.6-4) ...
226s Setting up redis-tools (5:7.0.15-3) ...
226s Setting up redis-server (5:7.0.15-3) ...
227s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'.
227s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'.
227s Setting up redis-sentinel (5:7.0.15-3) ...
228s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
228s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
228s Setting up redis (5:7.0.15-3) ...
228s Processing triggers for man-db (2.13.0-1) ...
228s Processing triggers for libc-bin (2.41-1ubuntu2) ...
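The 0001-redis-cli test output that follows is essentially the server's INFO dump plus a version banner. A minimal sketch of an equivalent manual check, assuming the freshly installed redis-server is already listening on the default 127.0.0.1:6379 (illustrative commands, not necessarily the exact ones the packaged test runs):

    # confirm the packaged server answers on the default port and report its version
    redis-cli -h 127.0.0.1 -p 6379 ping
    redis-cli -h 127.0.0.1 -p 6379 info server
    redis-cli --version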
241s autopkgtest [16:56:30]: test 0001-redis-cli: [-----------------------
248s # Server
248s redis_version:7.0.15
248s redis_git_sha1:00000000
248s redis_git_dirty:0
248s redis_build_id:1369a98afcafaf0
248s redis_mode:standalone
248s os:Linux 6.8.0-52-generic armv7l
248s arch_bits:32
248s monotonic_clock:POSIX clock_gettime
248s multiplexing_api:epoll
248s atomicvar_api:c11-builtin
248s gcc_version:14.2.0
248s process_id:1270
248s process_supervised:systemd
248s run_id:4a9e22258a7681aa50389a81e868085d9ab0a3c6
248s tcp_port:6379
248s server_time_usec:1742057797942574
248s uptime_in_seconds:5
248s uptime_in_days:0
248s hz:10
248s configured_hz:10
248s lru_clock:14004549
248s executable:/usr/bin/redis-server
248s config_file:/etc/redis/redis.conf
248s io_threads_active:0
248s
248s # Clients
248s connected_clients:3
248s cluster_connections:0
248s maxclients:10000
248s client_recent_max_input_buffer:20480
248s client_recent_max_output_buffer:0
248s blocked_clients:0
248s tracking_clients:0
248s clients_in_timeout_table:0
248s
248s # Memory
248s used_memory:917016
248s used_memory_human:895.52K
248s used_memory_rss:9699328
248s used_memory_rss_human:9.25M
248s used_memory_peak:917016
248s used_memory_peak_human:895.52K
248s used_memory_peak_perc:102.24%
248s used_memory_overhead:752848
248s used_memory_startup:707376
248s used_memory_dataset:164168
248s used_memory_dataset_perc:78.31%
248s allocator_allocated:4528544
248s allocator_active:10485760
248s allocator_resident:11206656
248s total_system_memory:3844112384
248s total_system_memory_human:3.58G
248s used_memory_lua:22528
248s used_memory_vm_eval:22528
248s used_memory_lua_human:22.00K
248s used_memory_scripts_eval:0
248s number_of_cached_scripts:0
248s number_of_functions:0
248s number_of_libraries:0
248s used_memory_vm_functions:23552
248s used_memory_vm_total:46080
248s used_memory_vm_total_human:45.00K
248s used_memory_functions:120
248s used_memory_scripts:120
248s used_memory_scripts_human:120B
248s maxmemory:3221225472
248s maxmemory_human:3.00G
248s maxmemory_policy:noeviction
248s allocator_frag_ratio:2.32
248s allocator_frag_bytes:5957216
248s allocator_rss_ratio:1.07
248s allocator_rss_bytes:720896
248s rss_overhead_ratio:0.87
248s rss_overhead_bytes:-1507328
248s mem_fragmentation_ratio:11.05
248s mem_fragmentation_bytes:8821880
248s mem_not_counted_for_evict:0
248s mem_replication_backlog:0
248s mem_total_replication_buffers:0
248s mem_clients_slaves:0
248s mem_clients_normal:45352
248s mem_cluster_links:0
248s mem_aof_buffer:0
248s mem_allocator:jemalloc-5.3.0
248s active_defrag_running:0
248s lazyfree_pending_objects:0
248s lazyfreed_objects:0
248s
248s # Persistence
248s loading:0
248s async_loading:0
248s current_cow_peak:0
248s current_cow_size:0
248s current_cow_size_age:0
248s current_fork_perc:0.00
248s current_save_keys_processed:0
248s current_save_keys_total:0
248s rdb_changes_since_last_save:0
248s rdb_bgsave_in_progress:0
248s rdb_last_save_time:1742057792
248s rdb_last_bgsave_status:ok
248s rdb_last_bgsave_time_sec:-1
248s rdb_current_bgsave_time_sec:-1
248s rdb_saves:0
248s rdb_last_cow_size:0
248s rdb_last_load_keys_expired:0
248s rdb_last_load_keys_loaded:0
248s aof_enabled:0
248s aof_rewrite_in_progress:0
248s aof_rewrite_scheduled:0
248s aof_last_rewrite_time_sec:-1
248s aof_current_rewrite_time_sec:-1
248s aof_last_bgrewrite_status:ok
248s aof_rewrites:0
248s aof_rewrites_consecutive_failures:0
248s aof_last_write_status:ok
248s aof_last_cow_size:0
248s module_fork_in_progress:0
248s module_fork_last_cow_size:0
248s
248s # Stats
248s total_connections_received:3
248s total_commands_processed:11
248s instantaneous_ops_per_sec:1
248s total_net_input_bytes:644
248s total_net_output_bytes:5296
248s total_net_repl_input_bytes:0
248s total_net_repl_output_bytes:0
248s instantaneous_input_kbps:0.02
248s instantaneous_output_kbps:2.98
248s instantaneous_input_repl_kbps:0.00
248s instantaneous_output_repl_kbps:0.00
248s rejected_connections:0
248s sync_full:0
248s sync_partial_ok:0
248s sync_partial_err:0
248s expired_keys:0
248s expired_stale_perc:0.00
248s expired_time_cap_reached_count:0
248s expire_cycle_cpu_milliseconds:0
248s evicted_keys:0
248s evicted_clients:0
248s total_eviction_exceeded_time:0
248s current_eviction_exceeded_time:0
248s keyspace_hits:0
248s keyspace_misses:0
248s pubsub_channels:1
248s pubsub_patterns:0
248s pubsubshard_channels:0
248s latest_fork_usec:0
248s total_forks:0
248s migrate_cached_sockets:0
248s slave_expires_tracked_keys:0
248s active_defrag_hits:0
248s active_defrag_misses:0
248s active_defrag_key_hits:0
248s active_defrag_key_misses:0
248s total_active_defrag_time:0
248s current_active_defrag_time:0
248s tracking_total_keys:0
248s tracking_total_items:0
248s tracking_total_prefixes:0
248s unexpected_error_replies:0
248s total_error_replies:0
248s dump_payload_sanitizations:0
248s total_reads_processed:8
248s total_writes_processed:9
248s io_threaded_reads_processed:0
248s io_threaded_writes_processed:0
248s reply_buffer_shrinks:2
248s reply_buffer_expands:1
248s
248s # Replication
248s role:master
248s connected_slaves:0
248s master_failover_state:no-failover
248s master_replid:6661b9a70cbc3050470f9d6bdf0129c5e45cd271
248s master_replid2:0000000000000000000000000000000000000000
248s master_repl_offset:0
248s second_repl_offset:-1
248s repl_backlog_active:0
248s repl_backlog_size:1048576
248s repl_backlog_first_byte_offset:0
248s repl_backlog_histlen:0
248s
248s # CPU
248s used_cpu_sys:0.049309
248s used_cpu_user:0.052457
248s used_cpu_sys_children:0.000497
248s used_cpu_user_children:0.000582
248s used_cpu_sys_main_thread:0.049106
248s used_cpu_user_main_thread:0.052241
248s
248s # Modules
248s
248s # Errorstats
248s
248s # Cluster
248s cluster_enabled:0
248s
248s # Keyspace
248s Redis ver. 7.0.15
249s autopkgtest [16:56:38]: test 0001-redis-cli: -----------------------]
253s autopkgtest [16:56:42]: test 0001-redis-cli: - - - - - - - - - - results - - - - - - - - - -
253s 0001-redis-cli PASS
257s autopkgtest [16:56:46]: test 0002-benchmark: preparing testbed
258s Reading package lists...
259s Building dependency tree...
259s Reading state information...
259s Starting pkgProblemResolver with broken count: 0
259s Starting 2 pkgProblemResolver with broken count: 0
259s Done
260s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
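Every benchmark block that follows reports the same run parameters: 100000 requests, 50 parallel clients, a 3-byte payload and keep-alive enabled against the default local instance. A minimal sketch of an equivalent manual invocation with the stock tool (the exact command line used by the packaged 0002-benchmark test is not shown in this log):

    # same request count, client count, payload size and keep-alive as the runs below
    redis-benchmark -h 127.0.0.1 -p 6379 -n 100000 -c 50 -d 3 -k 1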
273s autopkgtest [16:57:02]: test 0002-benchmark: [----------------------- 281s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=nan (overall: nan) PING_INLINE: rps=337091.7 (overall: 335754.0) avg_msec=1.252 (overall: 1.252) ====== PING_INLINE ====== 281s 100000 requests completed in 0.30 seconds 281s 50 parallel clients 281s 3 bytes payload 281s keep alive: 1 281s host configuration "save": 3600 1 300 100 60 10000 281s host configuration "appendonly": no 281s multi-thread: no 281s 281s Latency by percentile distribution: 281s 0.000% <= 0.495 milliseconds (cumulative count 10) 281s 50.000% <= 1.231 milliseconds (cumulative count 51290) 281s 75.000% <= 1.447 milliseconds (cumulative count 75220) 281s 87.500% <= 1.615 milliseconds (cumulative count 87870) 281s 93.750% <= 1.711 milliseconds (cumulative count 94030) 281s 96.875% <= 1.775 milliseconds (cumulative count 97030) 281s 98.438% <= 1.831 milliseconds (cumulative count 98480) 281s 99.219% <= 1.895 milliseconds (cumulative count 99300) 281s 99.609% <= 1.943 milliseconds (cumulative count 99620) 281s 99.805% <= 1.991 milliseconds (cumulative count 99830) 281s 99.902% <= 2.039 milliseconds (cumulative count 99920) 281s 99.951% <= 2.055 milliseconds (cumulative count 99960) 281s 99.976% <= 2.103 milliseconds (cumulative count 99980) 281s 99.988% <= 2.111 milliseconds (cumulative count 99990) 281s 99.994% <= 2.231 milliseconds (cumulative count 100000) 281s 100.000% <= 2.231 milliseconds (cumulative count 100000) 281s 281s Cumulative distribution of latencies: 281s 0.000% <= 0.103 milliseconds (cumulative count 0) 281s 0.010% <= 0.503 milliseconds (cumulative count 10) 281s 0.220% <= 0.607 milliseconds (cumulative count 220) 281s 1.680% <= 0.703 milliseconds (cumulative count 1680) 281s 4.860% <= 0.807 milliseconds (cumulative count 4860) 281s 9.830% <= 0.903 milliseconds (cumulative count 9830) 281s 18.740% <= 1.007 milliseconds (cumulative count 18740) 281s 31.020% <= 1.103 milliseconds (cumulative count 31020) 281s 47.710% <= 1.207 milliseconds (cumulative count 47710) 281s 60.870% <= 1.303 milliseconds (cumulative count 60870) 281s 71.860% <= 1.407 milliseconds (cumulative count 71860) 281s 79.800% <= 1.503 milliseconds (cumulative count 79800) 281s 87.420% <= 1.607 milliseconds (cumulative count 87420) 281s 93.530% <= 1.703 milliseconds (cumulative count 93530) 281s 98.030% <= 1.807 milliseconds (cumulative count 98030) 281s 99.350% <= 1.903 milliseconds (cumulative count 99350) 281s 99.850% <= 2.007 milliseconds (cumulative count 99850) 281s 99.980% <= 2.103 milliseconds (cumulative count 99980) 281s 100.000% <= 3.103 milliseconds (cumulative count 100000) 281s 281s Summary: 281s throughput summary: 335570.47 requests per second 281s latency summary (msec): 281s avg min p50 p95 p99 max 281s 1.252 0.488 1.231 1.735 1.871 2.231 281s PING_MBULK: rps=288565.8 (overall: 360348.2) avg_msec=1.098 (overall: 1.098) ====== PING_MBULK ====== 281s 100000 requests completed in 0.28 seconds 281s 50 parallel clients 281s 3 bytes payload 281s keep alive: 1 281s host configuration "save": 3600 1 300 100 60 10000 281s host configuration "appendonly": no 281s multi-thread: no 281s 281s Latency by percentile distribution: 281s 0.000% <= 0.487 milliseconds (cumulative count 30) 281s 50.000% <= 1.079 milliseconds (cumulative count 50900) 281s 75.000% <= 1.279 milliseconds (cumulative count 75960) 281s 87.500% <= 1.439 milliseconds (cumulative count 87870) 281s 93.750% <= 1.559 milliseconds (cumulative count 93930) 281s 96.875% <= 1.647 milliseconds 
(cumulative count 97020) 281s 98.438% <= 1.719 milliseconds (cumulative count 98470) 281s 99.219% <= 1.791 milliseconds (cumulative count 99220) 281s 99.609% <= 1.871 milliseconds (cumulative count 99640) 281s 99.805% <= 1.927 milliseconds (cumulative count 99810) 281s 99.902% <= 1.991 milliseconds (cumulative count 99910) 281s 99.951% <= 2.079 milliseconds (cumulative count 99960) 281s 99.976% <= 2.151 milliseconds (cumulative count 99980) 281s 99.988% <= 2.199 milliseconds (cumulative count 99990) 281s 99.994% <= 2.255 milliseconds (cumulative count 100000) 281s 100.000% <= 2.255 milliseconds (cumulative count 100000) 281s 281s Cumulative distribution of latencies: 281s 0.000% <= 0.103 milliseconds (cumulative count 0) 281s 0.060% <= 0.503 milliseconds (cumulative count 60) 281s 1.760% <= 0.607 milliseconds (cumulative count 1760) 281s 7.410% <= 0.703 milliseconds (cumulative count 7410) 281s 17.000% <= 0.807 milliseconds (cumulative count 17000) 281s 27.140% <= 0.903 milliseconds (cumulative count 27140) 281s 40.210% <= 1.007 milliseconds (cumulative count 40210) 281s 54.280% <= 1.103 milliseconds (cumulative count 54280) 281s 68.190% <= 1.207 milliseconds (cumulative count 68190) 281s 78.160% <= 1.303 milliseconds (cumulative count 78160) 281s 85.980% <= 1.407 milliseconds (cumulative count 85980) 281s 91.270% <= 1.503 milliseconds (cumulative count 91270) 281s 95.830% <= 1.607 milliseconds (cumulative count 95830) 281s 98.190% <= 1.703 milliseconds (cumulative count 98190) 281s 99.330% <= 1.807 milliseconds (cumulative count 99330) 281s 99.730% <= 1.903 milliseconds (cumulative count 99730) 281s 99.910% <= 2.007 milliseconds (cumulative count 99910) 281s 99.960% <= 2.103 milliseconds (cumulative count 99960) 281s 100.000% <= 3.103 milliseconds (cumulative count 100000) 281s 281s Summary: 281s throughput summary: 361010.81 requests per second 281s latency summary (msec): 281s avg min p50 p95 p99 max 281s 1.092 0.480 1.079 1.591 1.767 2.255 282s SET: rps=204800.0 (overall: 301176.5) avg_msec=1.410 (overall: 1.410) ====== SET ====== 282s 100000 requests completed in 0.33 seconds 282s 50 parallel clients 282s 3 bytes payload 282s keep alive: 1 282s host configuration "save": 3600 1 300 100 60 10000 282s host configuration "appendonly": no 282s multi-thread: no 282s 282s Latency by percentile distribution: 282s 0.000% <= 0.543 milliseconds (cumulative count 10) 282s 50.000% <= 1.375 milliseconds (cumulative count 50200) 282s 75.000% <= 1.607 milliseconds (cumulative count 75090) 282s 87.500% <= 1.791 milliseconds (cumulative count 87960) 282s 93.750% <= 1.903 milliseconds (cumulative count 93750) 282s 96.875% <= 1.999 milliseconds (cumulative count 97020) 282s 98.438% <= 2.079 milliseconds (cumulative count 98450) 282s 99.219% <= 2.167 milliseconds (cumulative count 99230) 282s 99.609% <= 2.239 milliseconds (cumulative count 99620) 282s 99.805% <= 2.311 milliseconds (cumulative count 99820) 282s 99.902% <= 2.383 milliseconds (cumulative count 99920) 282s 99.951% <= 2.455 milliseconds (cumulative count 99960) 282s 99.976% <= 2.487 milliseconds (cumulative count 99980) 282s 99.988% <= 2.559 milliseconds (cumulative count 99990) 282s 99.994% <= 2.567 milliseconds (cumulative count 100000) 282s 100.000% <= 2.567 milliseconds (cumulative count 100000) 282s 282s Cumulative distribution of latencies: 282s 0.000% <= 0.103 milliseconds (cumulative count 0) 282s 0.060% <= 0.607 milliseconds (cumulative count 60) 282s 0.270% <= 0.703 milliseconds (cumulative count 270) 282s 1.510% <= 0.807 
milliseconds (cumulative count 1510) 282s 3.610% <= 0.903 milliseconds (cumulative count 3610) 282s 7.630% <= 1.007 milliseconds (cumulative count 7630) 282s 14.190% <= 1.103 milliseconds (cumulative count 14190) 282s 26.570% <= 1.207 milliseconds (cumulative count 26570) 282s 40.620% <= 1.303 milliseconds (cumulative count 40620) 282s 54.480% <= 1.407 milliseconds (cumulative count 54480) 282s 65.480% <= 1.503 milliseconds (cumulative count 65480) 282s 75.090% <= 1.607 milliseconds (cumulative count 75090) 282s 82.380% <= 1.703 milliseconds (cumulative count 82380) 282s 88.890% <= 1.807 milliseconds (cumulative count 88890) 282s 93.750% <= 1.903 milliseconds (cumulative count 93750) 282s 97.200% <= 2.007 milliseconds (cumulative count 97200) 282s 98.670% <= 2.103 milliseconds (cumulative count 98670) 282s 100.000% <= 3.103 milliseconds (cumulative count 100000) 282s 282s Summary: 282s throughput summary: 305810.41 requests per second 282s latency summary (msec): 282s avg min p50 p95 p99 max 282s 1.406 0.536 1.375 1.935 2.143 2.567 282s GET: rps=154023.9 (overall: 420217.4) avg_msec=1.065 (overall: 1.065) ====== GET ====== 282s 100000 requests completed in 0.23 seconds 282s 50 parallel clients 282s 3 bytes payload 282s keep alive: 1 282s host configuration "save": 3600 1 300 100 60 10000 282s host configuration "appendonly": no 282s multi-thread: no 282s 282s Latency by percentile distribution: 282s 0.000% <= 0.295 milliseconds (cumulative count 10) 282s 50.000% <= 1.055 milliseconds (cumulative count 50040) 282s 75.000% <= 1.199 milliseconds (cumulative count 75070) 282s 87.500% <= 1.295 milliseconds (cumulative count 87670) 282s 93.750% <= 1.367 milliseconds (cumulative count 94020) 282s 96.875% <= 1.439 milliseconds (cumulative count 97120) 282s 98.438% <= 1.519 milliseconds (cumulative count 98520) 282s 99.219% <= 1.647 milliseconds (cumulative count 99220) 282s 99.609% <= 1.783 milliseconds (cumulative count 99630) 282s 99.805% <= 1.887 milliseconds (cumulative count 99810) 282s 99.902% <= 1.967 milliseconds (cumulative count 99910) 282s 99.951% <= 2.015 milliseconds (cumulative count 99960) 282s 99.976% <= 2.063 milliseconds (cumulative count 99980) 282s 99.988% <= 2.087 milliseconds (cumulative count 99990) 282s 99.994% <= 2.135 milliseconds (cumulative count 100000) 282s 100.000% <= 2.135 milliseconds (cumulative count 100000) 282s 282s Cumulative distribution of latencies: 282s 0.000% <= 0.103 milliseconds (cumulative count 0) 282s 0.020% <= 0.303 milliseconds (cumulative count 20) 282s 0.170% <= 0.407 milliseconds (cumulative count 170) 282s 0.510% <= 0.503 milliseconds (cumulative count 510) 282s 1.130% <= 0.607 milliseconds (cumulative count 1130) 282s 2.890% <= 0.703 milliseconds (cumulative count 2890) 282s 10.690% <= 0.807 milliseconds (cumulative count 10690) 282s 23.010% <= 0.903 milliseconds (cumulative count 23010) 282s 40.430% <= 1.007 milliseconds (cumulative count 40430) 282s 59.090% <= 1.103 milliseconds (cumulative count 59090) 282s 76.290% <= 1.207 milliseconds (cumulative count 76290) 282s 88.590% <= 1.303 milliseconds (cumulative count 88590) 282s 96.050% <= 1.407 milliseconds (cumulative count 96050) 282s 98.330% <= 1.503 milliseconds (cumulative count 98330) 282s 99.060% <= 1.607 milliseconds (cumulative count 99060) 282s 99.400% <= 1.703 milliseconds (cumulative count 99400) 282s 99.670% <= 1.807 milliseconds (cumulative count 99670) 282s 99.830% <= 1.903 milliseconds (cumulative count 99830) 282s 99.950% <= 2.007 milliseconds (cumulative count 99950) 282s 
99.990% <= 2.103 milliseconds (cumulative count 99990) 282s 100.000% <= 3.103 milliseconds (cumulative count 100000) 282s 282s Summary: 282s throughput summary: 425531.91 requests per second 282s latency summary (msec): 282s avg min p50 p95 p99 max 282s 1.061 0.288 1.055 1.391 1.591 2.135 282s INCR: rps=176760.0 (overall: 420857.2) avg_msec=1.076 (overall: 1.076) ====== INCR ====== 282s 100000 requests completed in 0.23 seconds 282s 50 parallel clients 282s 3 bytes payload 282s keep alive: 1 282s host configuration "save": 3600 1 300 100 60 10000 282s host configuration "appendonly": no 282s multi-thread: no 282s 282s Latency by percentile distribution: 282s 0.000% <= 0.319 milliseconds (cumulative count 10) 282s 50.000% <= 1.055 milliseconds (cumulative count 50100) 282s 75.000% <= 1.191 milliseconds (cumulative count 75530) 282s 87.500% <= 1.279 milliseconds (cumulative count 87660) 282s 93.750% <= 1.351 milliseconds (cumulative count 94190) 282s 96.875% <= 1.415 milliseconds (cumulative count 96970) 282s 98.438% <= 1.487 milliseconds (cumulative count 98520) 282s 99.219% <= 1.551 milliseconds (cumulative count 99270) 282s 99.609% <= 1.599 milliseconds (cumulative count 99630) 282s 99.805% <= 1.647 milliseconds (cumulative count 99810) 282s 99.902% <= 1.687 milliseconds (cumulative count 99910) 282s 99.951% <= 1.711 milliseconds (cumulative count 99960) 282s 99.976% <= 1.735 milliseconds (cumulative count 99980) 282s 99.988% <= 1.751 milliseconds (cumulative count 99990) 282s 99.994% <= 1.847 milliseconds (cumulative count 100000) 282s 100.000% <= 1.847 milliseconds (cumulative count 100000) 282s 282s Cumulative distribution of latencies: 282s 0.000% <= 0.103 milliseconds (cumulative count 0) 282s 0.120% <= 0.407 milliseconds (cumulative count 120) 282s 0.230% <= 0.503 milliseconds (cumulative count 230) 282s 0.370% <= 0.607 milliseconds (cumulative count 370) 282s 1.070% <= 0.703 milliseconds (cumulative count 1070) 282s 10.020% <= 0.807 milliseconds (cumulative count 10020) 282s 21.640% <= 0.903 milliseconds (cumulative count 21640) 282s 40.120% <= 1.007 milliseconds (cumulative count 40120) 282s 60.060% <= 1.103 milliseconds (cumulative count 60060) 282s 77.750% <= 1.207 milliseconds (cumulative count 77750) 282s 90.400% <= 1.303 milliseconds (cumulative count 90400) 282s 96.760% <= 1.407 milliseconds (cumulative count 96760) 282s 98.750% <= 1.503 milliseconds (cumulative count 98750) 282s 99.650% <= 1.607 milliseconds (cumulative count 99650) 282s 99.930% <= 1.703 milliseconds (cumulative count 99930) 282s 99.990% <= 1.807 milliseconds (cumulative count 99990) 282s 100.000% <= 1.903 milliseconds (cumulative count 100000) 282s 282s Summary: 282s throughput summary: 427350.44 requests per second 282s latency summary (msec): 282s avg min p50 p95 p99 max 282s 1.059 0.312 1.055 1.367 1.535 1.847 282s LPUSH: rps=159920.3 (overall: 334500.0) avg_msec=1.370 (overall: 1.370) ====== LPUSH ====== 282s 100000 requests completed in 0.30 seconds 282s 50 parallel clients 282s 3 bytes payload 282s keep alive: 1 282s host configuration "save": 3600 1 300 100 60 10000 282s host configuration "appendonly": no 282s multi-thread: no 282s 282s Latency by percentile distribution: 282s 0.000% <= 0.399 milliseconds (cumulative count 10) 282s 50.000% <= 1.359 milliseconds (cumulative count 50110) 282s 75.000% <= 1.519 milliseconds (cumulative count 75610) 282s 87.500% <= 1.623 milliseconds (cumulative count 87610) 282s 93.750% <= 1.711 milliseconds (cumulative count 94020) 282s 96.875% <= 1.783 milliseconds 
(cumulative count 96890) 282s 98.438% <= 1.847 milliseconds (cumulative count 98490) 282s 99.219% <= 1.919 milliseconds (cumulative count 99260) 282s 99.609% <= 1.991 milliseconds (cumulative count 99610) 282s 99.805% <= 2.023 milliseconds (cumulative count 99810) 282s 99.902% <= 2.143 milliseconds (cumulative count 99910) 282s 99.951% <= 2.295 milliseconds (cumulative count 99960) 282s 99.976% <= 2.367 milliseconds (cumulative count 99980) 282s 99.988% <= 2.391 milliseconds (cumulative count 99990) 282s 99.994% <= 2.415 milliseconds (cumulative count 100000) 282s 100.000% <= 2.415 milliseconds (cumulative count 100000) 282s 282s Cumulative distribution of latencies: 282s 0.000% <= 0.103 milliseconds (cumulative count 0) 282s 0.020% <= 0.407 milliseconds (cumulative count 20) 282s 0.120% <= 0.503 milliseconds (cumulative count 120) 282s 0.300% <= 0.607 milliseconds (cumulative count 300) 282s 0.460% <= 0.703 milliseconds (cumulative count 460) 282s 1.120% <= 0.807 milliseconds (cumulative count 1120) 282s 3.320% <= 0.903 milliseconds (cumulative count 3320) 282s 7.930% <= 1.007 milliseconds (cumulative count 7930) 282s 14.200% <= 1.103 milliseconds (cumulative count 14200) 282s 25.780% <= 1.207 milliseconds (cumulative count 25780) 282s 40.780% <= 1.303 milliseconds (cumulative count 40780) 282s 58.500% <= 1.407 milliseconds (cumulative count 58500) 282s 73.440% <= 1.503 milliseconds (cumulative count 73440) 282s 86.050% <= 1.607 milliseconds (cumulative count 86050) 282s 93.530% <= 1.703 milliseconds (cumulative count 93530) 282s 97.580% <= 1.807 milliseconds (cumulative count 97580) 282s 99.160% <= 1.903 milliseconds (cumulative count 99160) 282s 99.710% <= 2.007 milliseconds (cumulative count 99710) 282s 99.880% <= 2.103 milliseconds (cumulative count 99880) 282s 100.000% <= 3.103 milliseconds (cumulative count 100000) 282s 282s Summary: 282s throughput summary: 337837.84 requests per second 282s latency summary (msec): 282s avg min p50 p95 p99 max 282s 1.354 0.392 1.359 1.735 1.895 2.415 283s RPUSH: rps=117920.0 (overall: 409444.5) avg_msec=1.106 (overall: 1.106) ====== RPUSH ====== 283s 100000 requests completed in 0.24 seconds 283s 50 parallel clients 283s 3 bytes payload 283s keep alive: 1 283s host configuration "save": 3600 1 300 100 60 10000 283s host configuration "appendonly": no 283s multi-thread: no 283s 283s Latency by percentile distribution: 283s 0.000% <= 0.335 milliseconds (cumulative count 10) 283s 50.000% <= 1.111 milliseconds (cumulative count 50860) 283s 75.000% <= 1.239 milliseconds (cumulative count 76080) 283s 87.500% <= 1.319 milliseconds (cumulative count 87880) 283s 93.750% <= 1.375 milliseconds (cumulative count 93850) 283s 96.875% <= 1.431 milliseconds (cumulative count 96920) 283s 98.438% <= 1.487 milliseconds (cumulative count 98480) 283s 99.219% <= 1.543 milliseconds (cumulative count 99320) 283s 99.609% <= 1.591 milliseconds (cumulative count 99620) 283s 99.805% <= 1.647 milliseconds (cumulative count 99810) 283s 99.902% <= 1.783 milliseconds (cumulative count 99920) 283s 99.951% <= 1.831 milliseconds (cumulative count 99960) 283s 99.976% <= 1.855 milliseconds (cumulative count 99980) 283s 99.988% <= 1.863 milliseconds (cumulative count 99990) 283s 99.994% <= 2.007 milliseconds (cumulative count 100000) 283s 100.000% <= 2.007 milliseconds (cumulative count 100000) 283s 283s Cumulative distribution of latencies: 283s 0.000% <= 0.103 milliseconds (cumulative count 0) 283s 0.070% <= 0.407 milliseconds (cumulative count 70) 283s 0.120% <= 0.503 milliseconds 
(cumulative count 120) 283s 0.220% <= 0.607 milliseconds (cumulative count 220) 283s 0.670% <= 0.703 milliseconds (cumulative count 670) 283s 6.580% <= 0.807 milliseconds (cumulative count 6580) 283s 15.880% <= 0.903 milliseconds (cumulative count 15880) 283s 29.420% <= 1.007 milliseconds (cumulative count 29420) 283s 48.980% <= 1.103 milliseconds (cumulative count 48980) 283s 70.570% <= 1.207 milliseconds (cumulative count 70570) 283s 85.680% <= 1.303 milliseconds (cumulative count 85680) 283s 95.910% <= 1.407 milliseconds (cumulative count 95910) 283s 98.790% <= 1.503 milliseconds (cumulative count 98790) 283s 99.700% <= 1.607 milliseconds (cumulative count 99700) 283s 99.840% <= 1.703 milliseconds (cumulative count 99840) 283s 99.940% <= 1.807 milliseconds (cumulative count 99940) 283s 99.990% <= 1.903 milliseconds (cumulative count 99990) 283s 100.000% <= 2.007 milliseconds (cumulative count 100000) 283s 283s Summary: 283s throughput summary: 411522.62 requests per second 283s latency summary (msec): 283s avg min p50 p95 p99 max 283s 1.104 0.328 1.111 1.399 1.519 2.007 283s LPOP: rps=110079.7 (overall: 354230.8) avg_msec=1.285 (overall: 1.285) ====== LPOP ====== 283s 100000 requests completed in 0.28 seconds 283s 50 parallel clients 283s 3 bytes payload 283s keep alive: 1 283s host configuration "save": 3600 1 300 100 60 10000 283s host configuration "appendonly": no 283s multi-thread: no 283s 283s Latency by percentile distribution: 283s 0.000% <= 0.319 milliseconds (cumulative count 10) 283s 50.000% <= 1.303 milliseconds (cumulative count 50400) 283s 75.000% <= 1.439 milliseconds (cumulative count 75970) 283s 87.500% <= 1.519 milliseconds (cumulative count 87600) 283s 93.750% <= 1.583 milliseconds (cumulative count 93870) 283s 96.875% <= 1.639 milliseconds (cumulative count 96930) 283s 98.438% <= 1.687 milliseconds (cumulative count 98440) 283s 99.219% <= 1.735 milliseconds (cumulative count 99240) 283s 99.609% <= 1.791 milliseconds (cumulative count 99630) 283s 99.805% <= 1.839 milliseconds (cumulative count 99840) 283s 99.902% <= 1.871 milliseconds (cumulative count 99910) 283s 99.951% <= 1.911 milliseconds (cumulative count 99960) 283s 99.976% <= 1.943 milliseconds (cumulative count 99980) 283s 99.988% <= 1.959 milliseconds (cumulative count 99990) 283s 99.994% <= 1.999 milliseconds (cumulative count 100000) 283s 100.000% <= 1.999 milliseconds (cumulative count 100000) 283s 283s Cumulative distribution of latencies: 283s 0.000% <= 0.103 milliseconds (cumulative count 0) 283s 0.090% <= 0.407 milliseconds (cumulative count 90) 283s 0.170% <= 0.503 milliseconds (cumulative count 170) 283s 0.280% <= 0.607 milliseconds (cumulative count 280) 283s 0.290% <= 0.703 milliseconds (cumulative count 290) 283s 0.970% <= 0.807 milliseconds (cumulative count 970) 283s 5.780% <= 0.903 milliseconds (cumulative count 5780) 283s 13.820% <= 1.007 milliseconds (cumulative count 13820) 283s 18.620% <= 1.103 milliseconds (cumulative count 18620) 283s 31.730% <= 1.207 milliseconds (cumulative count 31730) 283s 50.400% <= 1.303 milliseconds (cumulative count 50400) 283s 70.610% <= 1.407 milliseconds (cumulative count 70610) 283s 85.480% <= 1.503 milliseconds (cumulative count 85480) 283s 95.380% <= 1.607 milliseconds (cumulative count 95380) 283s 98.760% <= 1.703 milliseconds (cumulative count 98760) 283s 99.700% <= 1.807 milliseconds (cumulative count 99700) 283s 99.950% <= 1.903 milliseconds (cumulative count 99950) 283s 100.000% <= 2.007 milliseconds (cumulative count 100000) 283s 283s Summary: 283s 
throughput summary: 358422.91 requests per second 283s latency summary (msec): 283s avg min p50 p95 p99 max 283s 1.284 0.312 1.303 1.607 1.719 1.999 283s RPOP: rps=71474.1 (overall: 373750.0) avg_msec=1.183 (overall: 1.183) ====== RPOP ====== 283s 100000 requests completed in 0.26 seconds 283s 50 parallel clients 283s 3 bytes payload 283s keep alive: 1 283s host configuration "save": 3600 1 300 100 60 10000 283s host configuration "appendonly": no 283s multi-thread: no 283s 283s Latency by percentile distribution: 283s 0.000% <= 0.303 milliseconds (cumulative count 10) 283s 50.000% <= 1.223 milliseconds (cumulative count 50630) 283s 75.000% <= 1.359 milliseconds (cumulative count 76260) 283s 87.500% <= 1.447 milliseconds (cumulative count 88290) 283s 93.750% <= 1.511 milliseconds (cumulative count 94160) 283s 96.875% <= 1.575 milliseconds (cumulative count 97020) 283s 98.438% <= 1.631 milliseconds (cumulative count 98510) 283s 99.219% <= 1.695 milliseconds (cumulative count 99270) 283s 99.609% <= 1.767 milliseconds (cumulative count 99630) 283s 99.805% <= 1.823 milliseconds (cumulative count 99810) 283s 99.902% <= 1.863 milliseconds (cumulative count 99920) 283s 99.951% <= 1.879 milliseconds (cumulative count 99960) 283s 99.976% <= 1.895 milliseconds (cumulative count 99980) 283s 99.988% <= 1.911 milliseconds (cumulative count 99990) 283s 99.994% <= 1.935 milliseconds (cumulative count 100000) 283s 100.000% <= 1.935 milliseconds (cumulative count 100000) 283s 283s Cumulative distribution of latencies: 283s 0.000% <= 0.103 milliseconds (cumulative count 0) 283s 0.010% <= 0.303 milliseconds (cumulative count 10) 283s 0.160% <= 0.407 milliseconds (cumulative count 160) 283s 0.320% <= 0.503 milliseconds (cumulative count 320) 283s 0.480% <= 0.607 milliseconds (cumulative count 480) 283s 0.800% <= 0.703 milliseconds (cumulative count 800) 283s 2.810% <= 0.807 milliseconds (cumulative count 2810) 283s 9.960% <= 0.903 milliseconds (cumulative count 9960) 283s 16.280% <= 1.007 milliseconds (cumulative count 16280) 283s 27.460% <= 1.103 milliseconds (cumulative count 27460) 283s 47.080% <= 1.207 milliseconds (cumulative count 47080) 283s 66.790% <= 1.303 milliseconds (cumulative count 66790) 283s 83.140% <= 1.407 milliseconds (cumulative count 83140) 283s 93.630% <= 1.503 milliseconds (cumulative count 93630) 283s 97.940% <= 1.607 milliseconds (cumulative count 97940) 283s 99.330% <= 1.703 milliseconds (cumulative count 99330) 283s 99.770% <= 1.807 milliseconds (cumulative count 99770) 283s 99.980% <= 1.903 milliseconds (cumulative count 99980) 283s 100.000% <= 2.007 milliseconds (cumulative count 100000) 283s 283s Summary: 283s throughput summary: 377358.50 requests per second 283s latency summary (msec): 283s avg min p50 p95 p99 max 283s 1.210 0.296 1.223 1.527 1.671 1.935 283s SADD: rps=54760.0 (overall: 441612.9) avg_msec=1.010 (overall: 1.010) ====== SADD ====== 283s 100000 requests completed in 0.23 seconds 283s 50 parallel clients 283s 3 bytes payload 283s keep alive: 1 283s host configuration "save": 3600 1 300 100 60 10000 283s host configuration "appendonly": no 283s multi-thread: no 283s 283s Latency by percentile distribution: 283s 0.000% <= 0.327 milliseconds (cumulative count 10) 283s 50.000% <= 1.047 milliseconds (cumulative count 51530) 283s 75.000% <= 1.175 milliseconds (cumulative count 75930) 283s 87.500% <= 1.263 milliseconds (cumulative count 88300) 283s 93.750% <= 1.327 milliseconds (cumulative count 93990) 283s 96.875% <= 1.423 milliseconds (cumulative count 96890) 283s 
98.438% <= 1.727 milliseconds (cumulative count 98440) 283s 99.219% <= 1.943 milliseconds (cumulative count 99220) 283s 99.609% <= 2.111 milliseconds (cumulative count 99620) 283s 99.805% <= 2.207 milliseconds (cumulative count 99810) 283s 99.902% <= 2.335 milliseconds (cumulative count 99910) 283s 99.951% <= 2.455 milliseconds (cumulative count 99960) 283s 99.976% <= 2.551 milliseconds (cumulative count 99980) 283s 99.988% <= 2.687 milliseconds (cumulative count 99990) 283s 99.994% <= 2.783 milliseconds (cumulative count 100000) 283s 100.000% <= 2.783 milliseconds (cumulative count 100000) 283s 283s Cumulative distribution of latencies: 283s 0.000% <= 0.103 milliseconds (cumulative count 0) 283s 0.110% <= 0.407 milliseconds (cumulative count 110) 283s 0.280% <= 0.503 milliseconds (cumulative count 280) 283s 0.520% <= 0.607 milliseconds (cumulative count 520) 283s 1.870% <= 0.703 milliseconds (cumulative count 1870) 283s 11.870% <= 0.807 milliseconds (cumulative count 11870) 283s 23.520% <= 0.903 milliseconds (cumulative count 23520) 283s 42.840% <= 1.007 milliseconds (cumulative count 42840) 283s 63.340% <= 1.103 milliseconds (cumulative count 63340) 283s 80.790% <= 1.207 milliseconds (cumulative count 80790) 283s 92.370% <= 1.303 milliseconds (cumulative count 92370) 283s 96.630% <= 1.407 milliseconds (cumulative count 96630) 283s 97.590% <= 1.503 milliseconds (cumulative count 97590) 283s 98.060% <= 1.607 milliseconds (cumulative count 98060) 283s 98.340% <= 1.703 milliseconds (cumulative count 98340) 283s 98.710% <= 1.807 milliseconds (cumulative count 98710) 283s 99.060% <= 1.903 milliseconds (cumulative count 99060) 283s 99.400% <= 2.007 milliseconds (cumulative count 99400) 283s 99.600% <= 2.103 milliseconds (cumulative count 99600) 283s 100.000% <= 3.103 milliseconds (cumulative count 100000) 283s 283s Summary: 283s throughput summary: 429184.56 requests per second 283s latency summary (msec): 283s avg min p50 p95 p99 max 283s 1.052 0.320 1.047 1.351 1.887 2.783 284s HSET: rps=70637.5 (overall: 377234.1) avg_msec=1.190 (overall: 1.190) ====== HSET ====== 284s 100000 requests completed in 0.26 seconds 284s 50 parallel clients 284s 3 bytes payload 284s keep alive: 1 284s host configuration "save": 3600 1 300 100 60 10000 284s host configuration "appendonly": no 284s multi-thread: no 284s 284s Latency by percentile distribution: 284s 0.000% <= 0.407 milliseconds (cumulative count 10) 284s 50.000% <= 1.199 milliseconds (cumulative count 50170) 284s 75.000% <= 1.327 milliseconds (cumulative count 75790) 284s 87.500% <= 1.407 milliseconds (cumulative count 87910) 284s 93.750% <= 1.455 milliseconds (cumulative count 93770) 284s 96.875% <= 1.503 milliseconds (cumulative count 97250) 284s 98.438% <= 1.543 milliseconds (cumulative count 98500) 284s 99.219% <= 1.583 milliseconds (cumulative count 99230) 284s 99.609% <= 1.615 milliseconds (cumulative count 99610) 284s 99.805% <= 1.655 milliseconds (cumulative count 99830) 284s 99.902% <= 1.687 milliseconds (cumulative count 99920) 284s 99.951% <= 1.743 milliseconds (cumulative count 99960) 284s 99.976% <= 1.775 milliseconds (cumulative count 99980) 284s 99.988% <= 1.799 milliseconds (cumulative count 99990) 284s 99.994% <= 1.815 milliseconds (cumulative count 100000) 284s 100.000% <= 1.815 milliseconds (cumulative count 100000) 284s 284s Cumulative distribution of latencies: 284s 0.000% <= 0.103 milliseconds (cumulative count 0) 284s 0.010% <= 0.407 milliseconds (cumulative count 10) 284s 0.160% <= 0.503 milliseconds (cumulative count 160) 
284s 0.270% <= 0.607 milliseconds (cumulative count 270) 284s 0.450% <= 0.703 milliseconds (cumulative count 450) 284s 2.370% <= 0.807 milliseconds (cumulative count 2370) 284s 11.550% <= 0.903 milliseconds (cumulative count 11550) 284s 17.880% <= 1.007 milliseconds (cumulative count 17880) 284s 30.340% <= 1.103 milliseconds (cumulative count 30340) 284s 52.010% <= 1.207 milliseconds (cumulative count 52010) 284s 71.640% <= 1.303 milliseconds (cumulative count 71640) 284s 87.910% <= 1.407 milliseconds (cumulative count 87910) 284s 97.250% <= 1.503 milliseconds (cumulative count 97250) 284s 99.510% <= 1.607 milliseconds (cumulative count 99510) 284s 99.930% <= 1.703 milliseconds (cumulative count 99930) 284s 99.990% <= 1.807 milliseconds (cumulative count 99990) 284s 100.000% <= 1.903 milliseconds (cumulative count 100000) 284s 284s Summary: 284s throughput summary: 386100.38 requests per second 284s latency summary (msec): 284s avg min p50 p95 p99 max 284s 1.185 0.400 1.199 1.471 1.575 1.815 284s SPOP: rps=69120.0 (overall: 467027.0) avg_msec=0.934 (overall: 0.934) ====== SPOP ====== 284s 100000 requests completed in 0.21 seconds 284s 50 parallel clients 284s 3 bytes payload 284s keep alive: 1 284s host configuration "save": 3600 1 300 100 60 10000 284s host configuration "appendonly": no 284s multi-thread: no 284s 284s Latency by percentile distribution: 284s 0.000% <= 0.311 milliseconds (cumulative count 10) 284s 50.000% <= 0.959 milliseconds (cumulative count 51350) 284s 75.000% <= 1.079 milliseconds (cumulative count 75440) 284s 87.500% <= 1.167 milliseconds (cumulative count 88670) 284s 93.750% <= 1.207 milliseconds (cumulative count 93900) 284s 96.875% <= 1.247 milliseconds (cumulative count 97030) 284s 98.438% <= 1.287 milliseconds (cumulative count 98490) 284s 99.219% <= 1.343 milliseconds (cumulative count 99280) 284s 99.609% <= 1.391 milliseconds (cumulative count 99620) 284s 99.805% <= 1.423 milliseconds (cumulative count 99810) 284s 99.902% <= 1.471 milliseconds (cumulative count 99910) 284s 99.951% <= 1.519 milliseconds (cumulative count 99960) 284s 99.976% <= 1.543 milliseconds (cumulative count 99980) 284s 99.988% <= 1.559 milliseconds (cumulative count 99990) 284s 99.994% <= 1.583 milliseconds (cumulative count 100000) 284s 100.000% <= 1.583 milliseconds (cumulative count 100000) 284s 284s Cumulative distribution of latencies: 284s 0.000% <= 0.103 milliseconds (cumulative count 0) 284s 0.170% <= 0.407 milliseconds (cumulative count 170) 284s 0.430% <= 0.503 milliseconds (cumulative count 430) 284s 0.800% <= 0.607 milliseconds (cumulative count 800) 284s 4.580% <= 0.703 milliseconds (cumulative count 4580) 284s 20.910% <= 0.807 milliseconds (cumulative count 20910) 284s 38.820% <= 0.903 milliseconds (cumulative count 38820) 284s 62.250% <= 1.007 milliseconds (cumulative count 62250) 284s 79.180% <= 1.103 milliseconds (cumulative count 79180) 284s 93.900% <= 1.207 milliseconds (cumulative count 93900) 284s 98.790% <= 1.303 milliseconds (cumulative count 98790) 284s 99.710% <= 1.407 milliseconds (cumulative count 99710) 284s 99.940% <= 1.503 milliseconds (cumulative count 99940) 284s 100.000% <= 1.607 milliseconds (cumulative count 100000) 284s 284s Summary: 284s throughput summary: 469483.56 requests per second 284s latency summary (msec): 284s avg min p50 p95 p99 max 284s 0.956 0.304 0.959 1.223 1.327 1.583 284s ZADD: rps=107051.8 (overall: 368082.2) avg_msec=1.221 (overall: 1.221) ====== ZADD ====== 284s 100000 requests completed in 0.27 seconds 284s 50 parallel clients 
284s 3 bytes payload 284s keep alive: 1 284s host configuration "save": 3600 1 300 100 60 10000 284s host configuration "appendonly": no 284s multi-thread: no 284s 284s Latency by percentile distribution: 284s 0.000% <= 0.327 milliseconds (cumulative count 10) 284s 50.000% <= 1.271 milliseconds (cumulative count 51240) 284s 75.000% <= 1.407 milliseconds (cumulative count 75950) 284s 87.500% <= 1.495 milliseconds (cumulative count 88140) 284s 93.750% <= 1.559 milliseconds (cumulative count 94270) 284s 96.875% <= 1.615 milliseconds (cumulative count 97100) 284s 98.438% <= 1.671 milliseconds (cumulative count 98520) 284s 99.219% <= 1.751 milliseconds (cumulative count 99250) 284s 99.609% <= 1.887 milliseconds (cumulative count 99610) 284s 99.805% <= 1.975 milliseconds (cumulative count 99810) 284s 99.902% <= 2.055 milliseconds (cumulative count 99910) 284s 99.951% <= 2.111 milliseconds (cumulative count 99960) 284s 99.976% <= 2.127 milliseconds (cumulative count 99980) 284s 99.988% <= 2.151 milliseconds (cumulative count 99990) 284s 99.994% <= 2.183 milliseconds (cumulative count 100000) 284s 100.000% <= 2.183 milliseconds (cumulative count 100000) 284s 284s Cumulative distribution of latencies: 284s 0.000% <= 0.103 milliseconds (cumulative count 0) 284s 0.170% <= 0.407 milliseconds (cumulative count 170) 284s 0.320% <= 0.503 milliseconds (cumulative count 320) 284s 0.460% <= 0.607 milliseconds (cumulative count 460) 284s 0.710% <= 0.703 milliseconds (cumulative count 710) 284s 1.250% <= 0.807 milliseconds (cumulative count 1250) 284s 7.890% <= 0.903 milliseconds (cumulative count 7890) 284s 15.420% <= 1.007 milliseconds (cumulative count 15420) 284s 22.460% <= 1.103 milliseconds (cumulative count 22460) 284s 38.180% <= 1.207 milliseconds (cumulative count 38180) 284s 57.920% <= 1.303 milliseconds (cumulative count 57920) 284s 75.950% <= 1.407 milliseconds (cumulative count 75950) 284s 89.030% <= 1.503 milliseconds (cumulative count 89030) 284s 96.780% <= 1.607 milliseconds (cumulative count 96780) 284s 98.950% <= 1.703 milliseconds (cumulative count 98950) 284s 99.420% <= 1.807 milliseconds (cumulative count 99420) 284s 99.650% <= 1.903 milliseconds (cumulative count 99650) 284s 99.860% <= 2.007 milliseconds (cumulative count 99860) 284s 99.950% <= 2.103 milliseconds (cumulative count 99950) 284s 100.000% <= 3.103 milliseconds (cumulative count 100000) 284s 284s Summary: 284s throughput summary: 366300.38 requests per second 284s latency summary (msec): 284s avg min p50 p95 p99 max 284s 1.252 0.320 1.271 1.575 1.711 2.183 284s ZPOPMIN: rps=86280.0 (overall: 449375.0) avg_msec=0.991 (overall: 0.991) ====== ZPOPMIN ====== 284s 100000 requests completed in 0.22 seconds 284s 50 parallel clients 284s 3 bytes payload 284s keep alive: 1 284s host configuration "save": 3600 1 300 100 60 10000 284s host configuration "appendonly": no 284s multi-thread: no 284s 284s Latency by percentile distribution: 284s 0.000% <= 0.335 milliseconds (cumulative count 10) 284s 50.000% <= 0.975 milliseconds (cumulative count 50160) 284s 75.000% <= 1.111 milliseconds (cumulative count 75690) 284s 87.500% <= 1.199 milliseconds (cumulative count 88150) 284s 93.750% <= 1.263 milliseconds (cumulative count 93930) 284s 96.875% <= 1.319 milliseconds (cumulative count 97010) 284s 98.438% <= 1.367 milliseconds (cumulative count 98570) 284s 99.219% <= 1.415 milliseconds (cumulative count 99260) 284s 99.609% <= 1.463 milliseconds (cumulative count 99630) 284s 99.805% <= 1.511 milliseconds (cumulative count 99810) 284s 99.902% <= 
1.575 milliseconds (cumulative count 99910) 284s 99.951% <= 1.615 milliseconds (cumulative count 99970) 284s 99.976% <= 1.631 milliseconds (cumulative count 99980) 284s 99.988% <= 1.639 milliseconds (cumulative count 99990) 284s 99.994% <= 1.695 milliseconds (cumulative count 100000) 284s 100.000% <= 1.695 milliseconds (cumulative count 100000) 284s 284s Cumulative distribution of latencies: 284s 0.000% <= 0.103 milliseconds (cumulative count 0) 284s 0.140% <= 0.407 milliseconds (cumulative count 140) 284s 0.320% <= 0.503 milliseconds (cumulative count 320) 284s 0.870% <= 0.607 milliseconds (cumulative count 870) 284s 3.870% <= 0.703 milliseconds (cumulative count 3870) 284s 17.620% <= 0.807 milliseconds (cumulative count 17620) 284s 35.410% <= 0.903 milliseconds (cumulative count 35410) 284s 56.940% <= 1.007 milliseconds (cumulative count 56940) 284s 74.540% <= 1.103 milliseconds (cumulative count 74540) 284s 88.990% <= 1.207 milliseconds (cumulative count 88990) 284s 96.340% <= 1.303 milliseconds (cumulative count 96340) 284s 99.180% <= 1.407 milliseconds (cumulative count 99180) 284s 99.800% <= 1.503 milliseconds (cumulative count 99800) 284s 99.950% <= 1.607 milliseconds (cumulative count 99950) 284s 100.000% <= 1.703 milliseconds (cumulative count 100000) 284s 284s Summary: 284s throughput summary: 458715.59 requests per second 284s latency summary (msec): 284s avg min p50 p95 p99 max 284s 0.982 0.328 0.975 1.287 1.391 1.695 285s LPUSH (needed to benchmark LRANGE): rps=106613.6 (overall: 343076.9) avg_msec=1.318 (overall: 1.318) ====== LPUSH (needed to benchmark LRANGE) ====== 285s 100000 requests completed in 0.29 seconds 285s 50 parallel clients 285s 3 bytes payload 285s keep alive: 1 285s host configuration "save": 3600 1 300 100 60 10000 285s host configuration "appendonly": no 285s multi-thread: no 285s 285s Latency by percentile distribution: 285s 0.000% <= 0.463 milliseconds (cumulative count 10) 285s 50.000% <= 1.343 milliseconds (cumulative count 50150) 285s 75.000% <= 1.495 milliseconds (cumulative count 75790) 285s 87.500% <= 1.591 milliseconds (cumulative count 87600) 285s 93.750% <= 1.671 milliseconds (cumulative count 93940) 285s 96.875% <= 1.735 milliseconds (cumulative count 97160) 285s 98.438% <= 1.783 milliseconds (cumulative count 98460) 285s 99.219% <= 1.839 milliseconds (cumulative count 99270) 285s 99.609% <= 1.887 milliseconds (cumulative count 99660) 285s 99.805% <= 1.927 milliseconds (cumulative count 99810) 285s 99.902% <= 1.983 milliseconds (cumulative count 99910) 285s 99.951% <= 2.087 milliseconds (cumulative count 99970) 285s 99.976% <= 2.103 milliseconds (cumulative count 99980) 285s 99.988% <= 2.135 milliseconds (cumulative count 99990) 285s 99.994% <= 2.159 milliseconds (cumulative count 100000) 285s 100.000% <= 2.159 milliseconds (cumulative count 100000) 285s 285s Cumulative distribution of latencies: 285s 0.000% <= 0.103 milliseconds (cumulative count 0) 285s 0.030% <= 0.503 milliseconds (cumulative count 30) 285s 0.130% <= 0.607 milliseconds (cumulative count 130) 285s 0.290% <= 0.703 milliseconds (cumulative count 290) 285s 1.010% <= 0.807 milliseconds (cumulative count 1010) 285s 4.030% <= 0.903 milliseconds (cumulative count 4030) 285s 9.960% <= 1.007 milliseconds (cumulative count 9960) 285s 16.890% <= 1.103 milliseconds (cumulative count 16890) 285s 28.200% <= 1.207 milliseconds (cumulative count 28200) 285s 43.250% <= 1.303 milliseconds (cumulative count 43250) 285s 61.480% <= 1.407 milliseconds (cumulative count 61480) 285s 76.870% <= 1.503 
285s LPUSH (needed to benchmark LRANGE): rps=106613.6 (overall: 343076.9) avg_msec=1.318 (overall: 1.318) ====== LPUSH (needed to benchmark LRANGE) ======
285s 100000 requests completed in 0.29 seconds
285s 50 parallel clients
285s 3 bytes payload
285s keep alive: 1
285s host configuration "save": 3600 1 300 100 60 10000
285s host configuration "appendonly": no
285s multi-thread: no
285s
285s Latency by percentile distribution:
285s 0.000% <= 0.463 milliseconds (cumulative count 10)
285s 50.000% <= 1.343 milliseconds (cumulative count 50150)
285s 75.000% <= 1.495 milliseconds (cumulative count 75790)
285s 87.500% <= 1.591 milliseconds (cumulative count 87600)
285s 93.750% <= 1.671 milliseconds (cumulative count 93940)
285s 96.875% <= 1.735 milliseconds (cumulative count 97160)
285s 98.438% <= 1.783 milliseconds (cumulative count 98460)
285s 99.219% <= 1.839 milliseconds (cumulative count 99270)
285s 99.609% <= 1.887 milliseconds (cumulative count 99660)
285s 99.805% <= 1.927 milliseconds (cumulative count 99810)
285s 99.902% <= 1.983 milliseconds (cumulative count 99910)
285s 99.951% <= 2.087 milliseconds (cumulative count 99970)
285s 99.976% <= 2.103 milliseconds (cumulative count 99980)
285s 99.988% <= 2.135 milliseconds (cumulative count 99990)
285s 99.994% <= 2.159 milliseconds (cumulative count 100000)
285s 100.000% <= 2.159 milliseconds (cumulative count 100000)
285s
285s Cumulative distribution of latencies:
285s 0.000% <= 0.103 milliseconds (cumulative count 0)
285s 0.030% <= 0.503 milliseconds (cumulative count 30)
285s 0.130% <= 0.607 milliseconds (cumulative count 130)
285s 0.290% <= 0.703 milliseconds (cumulative count 290)
285s 1.010% <= 0.807 milliseconds (cumulative count 1010)
285s 4.030% <= 0.903 milliseconds (cumulative count 4030)
285s 9.960% <= 1.007 milliseconds (cumulative count 9960)
285s 16.890% <= 1.103 milliseconds (cumulative count 16890)
285s 28.200% <= 1.207 milliseconds (cumulative count 28200)
285s 43.250% <= 1.303 milliseconds (cumulative count 43250)
285s 61.480% <= 1.407 milliseconds (cumulative count 61480)
285s 76.870% <= 1.503 milliseconds (cumulative count 76870)
285s 89.010% <= 1.607 milliseconds (cumulative count 89010)
285s 95.910% <= 1.703 milliseconds (cumulative count 95910)
285s 98.820% <= 1.807 milliseconds (cumulative count 98820)
285s 99.750% <= 1.903 milliseconds (cumulative count 99750)
285s 99.930% <= 2.007 milliseconds (cumulative count 99930)
285s 99.980% <= 2.103 milliseconds (cumulative count 99980)
285s 100.000% <= 3.103 milliseconds (cumulative count 100000)
285s
285s Summary:
285s throughput summary: 343642.59 requests per second
285s latency summary (msec):
285s avg min p50 p95 p99 max
285s 1.331 0.456 1.343 1.695 1.823 2.159
286s LRANGE_100 (first 100 elements): rps=10278.9 (overall: 73714.3) avg_msec=4.054 (overall: 4.054) LRANGE_100 (first 100 elements): rps=83067.7 (overall: 81923.1) avg_msec=3.067 (overall: 3.175) LRANGE_100 (first 100 elements): rps=83147.4 (overall: 82495.3) avg_msec=3.047 (overall: 3.115) LRANGE_100 (first 100 elements): rps=83769.8 (overall: 82902.4) avg_msec=3.039 (overall: 3.090) LRANGE_100 (first 100 elements): rps=83849.2 (overall: 83131.6) avg_msec=3.013 (overall: 3.071) ====== LRANGE_100 (first 100 elements) ======
286s 100000 requests completed in 1.20 seconds
286s 50 parallel clients
286s 3 bytes payload
286s keep alive: 1
286s host configuration "save": 3600 1 300 100 60 10000
286s host configuration "appendonly": no
286s multi-thread: no
286s
286s Latency by percentile distribution:
286s 0.000% <= 0.535 milliseconds (cumulative count 10)
286s 50.000% <= 3.031 milliseconds (cumulative count 51650)
286s 75.000% <= 3.127 milliseconds (cumulative count 75120)
286s 87.500% <= 3.215 milliseconds (cumulative count 87610)
286s 93.750% <= 3.319 milliseconds (cumulative count 93960)
286s 96.875% <= 3.463 milliseconds (cumulative count 96990)
286s 98.438% <= 3.631 milliseconds (cumulative count 98440)
286s 99.219% <= 4.119 milliseconds (cumulative count 99220)
286s 99.609% <= 5.703 milliseconds (cumulative count 99610)
286s 99.805% <= 6.927 milliseconds (cumulative count 99810)
286s 99.902% <= 7.799 milliseconds (cumulative count 99910)
286s 99.951% <= 8.487 milliseconds (cumulative count 99960)
286s 99.976% <= 8.695 milliseconds (cumulative count 99980)
286s 99.988% <= 8.815 milliseconds (cumulative count 99990)
286s 99.994% <= 8.951 milliseconds (cumulative count 100000)
286s 100.000% <= 8.951 milliseconds (cumulative count 100000)
286s
286s Cumulative distribution of latencies:
286s 0.000% <= 0.103 milliseconds (cumulative count 0)
286s 0.010% <= 0.607 milliseconds (cumulative count 10)
286s 0.020% <= 1.903 milliseconds (cumulative count 20)
286s 0.030% <= 2.103 milliseconds (cumulative count 30)
286s 70.660% <= 3.103 milliseconds (cumulative count 70660)
286s 99.210% <= 4.103 milliseconds (cumulative count 99210)
286s 99.490% <= 5.103 milliseconds (cumulative count 99490)
286s 99.670% <= 6.103 milliseconds (cumulative count 99670)
286s 99.830% <= 7.103 milliseconds (cumulative count 99830)
286s 99.930% <= 8.103 milliseconds (cumulative count 99930)
286s 100.000% <= 9.103 milliseconds (cumulative count 100000)
286s
286s Summary:
286s throughput summary: 83333.33 requests per second
286s latency summary (msec):
286s avg min p50 p95 p99 max
286s 3.065 0.528 3.031 3.351 3.847 8.951
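Editor's note: the LPUSH stage above is labelled "needed to benchmark LRANGE" because LRANGE can only read back elements that were pushed first; the LRANGE_100 figures then read the first 100 elements of that list. The same pairing, shown with redis-cli on a hypothetical key "mylist", purely as an illustration:

  # Illustrative only: seed a list, then read back its first 100 elements
  redis-cli LPUSH mylist a b c
  redis-cli LRANGE mylist 0 99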
290s LRANGE_300 (first 300 elements): rps=7110.2 (overall: 19419.4) avg_msec=14.886 (overall: 14.886) LRANGE_300 (first 300 elements): rps=24155.0 (overall: 22900.3) avg_msec=9.938 (overall: 11.050) LRANGE_300 (first 300 elements): rps=23833.3 (overall: 23290.2) avg_msec=11.077 (overall: 11.061) LRANGE_300 (first 300 elements): rps=24313.7 (overall: 23594.4) avg_msec=10.385 (overall: 10.854) LRANGE_300 (first 300 elements): rps=20796.8 (overall: 22961.2) avg_msec=13.987 (overall: 11.496) LRANGE_300 (first 300 elements): rps=24454.9 (overall: 23240.5) avg_msec=9.838 (overall: 11.170) LRANGE_300 (first 300 elements): rps=24335.9 (overall: 23413.6) avg_msec=9.996 (overall: 10.977) LRANGE_300 (first 300 elements): rps=23665.3 (overall: 23447.4) avg_msec=10.302 (overall: 10.886) LRANGE_300 (first 300 elements): rps=22777.8 (overall: 23367.9) avg_msec=11.941 (overall: 11.008) LRANGE_300 (first 300 elements): rps=21668.0 (overall: 23186.9) avg_msec=13.059 (overall: 11.212) LRANGE_300 (first 300 elements): rps=23410.9 (overall: 23208.8) avg_msec=11.060 (overall: 11.197) LRANGE_300 (first 300 elements): rps=22110.7 (overall: 23112.6) avg_msec=12.739 (overall: 11.326) LRANGE_300 (first 300 elements): rps=25865.1 (overall: 23333.5) avg_msec=9.607 (overall: 11.173) LRANGE_300 (first 300 elements): rps=22627.0 (overall: 23281.0) avg_msec=12.363 (overall: 11.259) LRANGE_300 (first 300 elements): rps=25781.2 (overall: 23456.5) avg_msec=9.677 (overall: 11.137) LRANGE_300 (first 300 elements): rps=24608.0 (overall: 23530.4) avg_msec=11.358 (overall: 11.152) LRANGE_300 (first 300 elements): rps=26023.9 (overall: 23681.3) avg_msec=9.680 (overall: 11.054) ====== LRANGE_300 (first 300 elements) ======
290s 100000 requests completed in 4.22 seconds
290s 50 parallel clients
290s 3 bytes payload
290s keep alive: 1
290s host configuration "save": 3600 1 300 100 60 10000
290s host configuration "appendonly": no
290s multi-thread: no
290s
290s Latency by percentile distribution:
290s 0.000% <= 0.455 milliseconds (cumulative count 10)
290s 50.000% <= 10.223 milliseconds (cumulative count 50040)
290s 75.000% <= 12.575 milliseconds (cumulative count 75010)
290s 87.500% <= 15.439 milliseconds (cumulative count 87500)
290s 93.750% <= 18.415 milliseconds (cumulative count 93760)
290s 96.875% <= 20.655 milliseconds (cumulative count 96880)
290s 98.438% <= 22.655 milliseconds (cumulative count 98440)
290s 99.219% <= 24.831 milliseconds (cumulative count 99220)
290s 99.609% <= 27.103 milliseconds (cumulative count 99610)
290s 99.805% <= 28.015 milliseconds (cumulative count 99810)
290s 99.902% <= 28.559 milliseconds (cumulative count 99910)
290s 99.951% <= 29.007 milliseconds (cumulative count 99970)
290s 99.976% <= 29.231 milliseconds (cumulative count 99980)
290s 99.988% <= 29.455 milliseconds (cumulative count 99990)
290s 99.994% <= 29.679 milliseconds (cumulative count 100000)
290s 100.000% <= 29.679 milliseconds (cumulative count 100000)
290s
290s Cumulative distribution of latencies:
290s 0.000% <= 0.103 milliseconds (cumulative count 0)
290s 0.010% <= 0.503 milliseconds (cumulative count 10)
290s 0.020% <= 0.703 milliseconds (cumulative count 20)
290s 0.030% <= 0.807 milliseconds (cumulative count 30)
290s 0.070% <= 0.903 milliseconds (cumulative count 70)
290s 0.090% <= 1.007 milliseconds (cumulative count 90)
290s 0.120% <= 1.207 milliseconds (cumulative count 120)
290s 0.200% <= 1.303 milliseconds (cumulative count 200)
290s 0.260% <= 1.503 milliseconds (cumulative count 260)
290s 0.330% <= 1.607 milliseconds (cumulative count 330)
290s 0.380% <= 1.703 milliseconds (cumulative count 380)
290s 0.440% <= 1.807 milliseconds (cumulative count 440)
290s 0.510% <= 1.903 milliseconds (cumulative count 510)
290s 0.580% <= 2.007 milliseconds (cumulative count 580)
290s 0.610% <= 2.103 milliseconds (cumulative count 610)
290s 0.920% <= 3.103 milliseconds (cumulative count 920)
290s 1.470% <= 4.103 milliseconds (cumulative count 1470)
290s 2.420% <= 5.103 milliseconds (cumulative count 2420)
290s 5.110% <= 6.103 milliseconds (cumulative count 5110)
290s 10.010% <= 7.103 milliseconds (cumulative count 10010)
290s 19.540% <= 8.103 milliseconds (cumulative count 19540)
290s 32.870% <= 9.103 milliseconds (cumulative count 32870)
290s 48.140% <= 10.103 milliseconds (cumulative count 48140)
290s 61.960% <= 11.103 milliseconds (cumulative count 61960)
290s 71.730% <= 12.103 milliseconds (cumulative count 71730)
290s 77.970% <= 13.103 milliseconds (cumulative count 77970)
290s 82.750% <= 14.103 milliseconds (cumulative count 82750)
290s 86.430% <= 15.103 milliseconds (cumulative count 86430)
290s 89.370% <= 16.103 milliseconds (cumulative count 89370)
290s 91.440% <= 17.103 milliseconds (cumulative count 91440)
290s 93.210% <= 18.111 milliseconds (cumulative count 93210)
290s 94.870% <= 19.103 milliseconds (cumulative count 94870)
290s 96.240% <= 20.111 milliseconds (cumulative count 96240)
290s 97.220% <= 21.103 milliseconds (cumulative count 97220)
290s 98.080% <= 22.111 milliseconds (cumulative count 98080)
290s 98.630% <= 23.103 milliseconds (cumulative count 98630)
290s 99.010% <= 24.111 milliseconds (cumulative count 99010)
290s 99.300% <= 25.103 milliseconds (cumulative count 99300)
290s 99.450% <= 26.111 milliseconds (cumulative count 99450)
290s 99.610% <= 27.103 milliseconds (cumulative count 99610)
290s 99.840% <= 28.111 milliseconds (cumulative count 99840)
290s 99.970% <= 29.103 milliseconds (cumulative count 99970)
290s 100.000% <= 30.111 milliseconds (cumulative count 100000)
290s
290s Summary:
290s throughput summary: 23719.16 requests per second
290s latency summary (msec):
290s avg min p50 p95 p99 max
290s 11.021 0.448 10.223 19.183 24.079 29.679
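Editor's note: across the LRANGE variants, throughput falls roughly in proportion to the number of elements returned (about 83k rps for 100 elements above versus about 23.7k rps for 300 here), so the useful signal is usually the per-test summary rather than the full histograms. A sketch for pulling just those summary lines out of a saved copy of this output; the file name is hypothetical:

  # Sketch: extract the test headers and one-line throughput summaries
  grep -E 'throughput summary|======' 0002-benchmark.log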
298s LRANGE_500 (first 500 elements): rps=6733.1 (overall: 9337.0) avg_msec=26.876 (overall: 26.876) LRANGE_500 (first 500 elements): rps=10745.1 (overall: 10160.6) avg_msec=27.263 (overall: 27.116) LRANGE_500 (first 500 elements): rps=13114.6 (overall: 11245.3) avg_msec=19.674 (overall: 23.929) LRANGE_500 (first 500 elements): rps=14332.0 (overall: 12088.6) avg_msec=17.547 (overall: 21.862) LRANGE_500 (first 500 elements): rps=14445.7 (overall: 12592.9) avg_msec=16.986 (overall: 20.665) LRANGE_500 (first 500 elements): rps=15039.4 (overall: 13018.5) avg_msec=13.412 (overall: 19.207) LRANGE_500 (first 500 elements): rps=15008.0 (overall: 13310.3) avg_msec=14.472 (overall: 18.424) LRANGE_500 (first 500 elements): rps=14476.2 (overall: 13460.0) avg_msec=16.341 (overall: 18.136) LRANGE_500 (first 500 elements): rps=14517.9 (overall: 13579.9) avg_msec=17.042 (overall: 18.004) LRANGE_500 (first 500 elements): rps=13940.9 (overall: 13617.1) avg_msec=18.636 (overall: 18.070) LRANGE_500 (first 500 elements): rps=12597.6 (overall: 13523.0) avg_msec=19.304 (overall: 18.176) LRANGE_500 (first 500 elements): rps=13071.7 (overall: 13484.8) avg_msec=18.438 (overall: 18.198) LRANGE_500 (first 500 elements): rps=12283.5 (overall: 13390.2) avg_msec=19.153 (overall: 18.267) LRANGE_500 (first 500 elements): rps=11857.7 (overall: 13278.7) avg_msec=20.573 (overall: 18.417) LRANGE_500 (first 500 elements): rps=10597.6 (overall: 13098.2) avg_msec=22.593 (overall: 18.644) LRANGE_500 (first 500 elements): rps=11893.7 (overall: 13021.3) avg_msec=20.169 (overall: 18.733) LRANGE_500 (first 500 elements): rps=13338.6 (overall: 13040.4) avg_msec=19.413 (overall: 18.775) LRANGE_500 (first 500 elements): rps=11897.2 (overall: 12975.9) avg_msec=21.791 (overall: 18.931) LRANGE_500 (first 500 elements): rps=14848.6 (overall: 13075.1) avg_msec=17.780 (overall: 18.861) LRANGE_500 (first 500 elements): rps=11094.9 (overall: 12974.8) avg_msec=23.121 (overall: 19.046) LRANGE_500 (first 500 elements): rps=12298.8 (overall: 12942.4) avg_msec=22.405 (overall: 19.199) LRANGE_500 (first 500 elements): rps=10773.8 (overall: 12843.0) avg_msec=26.896 (overall: 19.495) LRANGE_500 (first 500 elements): rps=10120.6 (overall: 12721.4) avg_msec=25.930 (overall: 19.724) LRANGE_500 (first 500 elements): rps=10175.3 (overall: 12614.9) avg_msec=26.828 (overall: 19.963) LRANGE_500 (first 500 elements): rps=10803.9 (overall: 12541.1) avg_msec=27.392 (overall: 20.224) LRANGE_500 (first 500 elements): rps=14544.7 (overall: 12620.2) avg_msec=14.961 (overall: 19.985) LRANGE_500 (first 500 elements): rps=15168.0 (overall: 12716.5) avg_msec=13.701 (overall: 19.701) LRANGE_500 (first 500 elements): rps=15171.9 (overall: 12805.9) avg_msec=14.164 (overall: 19.462) LRANGE_500 (first 500 elements): rps=11003.9 (overall: 12743.1) avg_msec=22.658 (overall: 19.559) LRANGE_500 (first 500 elements): rps=10776.9 (overall: 12677.6) avg_msec=26.665 (overall: 19.760) LRANGE_500 (first 500 elements): rps=10046.0 (overall: 12589.4) avg_msec=27.294 (overall: 19.961) ====== LRANGE_500 (first 500 elements) ======
298s 100000 requests completed in 7.97 seconds
298s 50 parallel clients
298s 3 bytes payload
298s keep alive: 1
298s host configuration "save": 3600 1 300 100 60 10000
298s host configuration "appendonly": no
298s multi-thread: no
298s
298s Latency by percentile distribution:
298s 0.000% <= 0.607 milliseconds (cumulative count 10)
298s 50.000% <= 19.327 milliseconds (cumulative count 50000)
298s 75.000% <= 25.775 milliseconds (cumulative count 75000)
298s 87.500% <= 31.775 milliseconds (cumulative count 87500)
298s 93.750% <= 35.007 milliseconds (cumulative count 93750)
298s 96.875% <= 36.927 milliseconds (cumulative count 96890)
298s 98.438% <= 38.111 milliseconds (cumulative count 98450)
298s 99.219% <= 39.391 milliseconds (cumulative count 99230)
298s 99.609% <= 40.447 milliseconds (cumulative count 99620)
298s 99.805% <= 41.151 milliseconds (cumulative count 99810)
298s 99.902% <= 41.663 milliseconds (cumulative count 99910)
298s 99.951% <= 42.079 milliseconds (cumulative count 99970)
298s 99.976% <= 42.239 milliseconds (cumulative count 99980)
298s 99.988% <= 42.303 milliseconds (cumulative count 99990)
298s 99.994% <= 42.751 milliseconds (cumulative count 100000)
298s 100.000% <= 42.751 milliseconds (cumulative count 100000)
298s
298s Cumulative distribution of latencies:
298s 0.000% <= 0.103 milliseconds (cumulative count 0)
298s 0.010% <= 0.607 milliseconds (cumulative count 10)
298s 0.020% <= 1.103 milliseconds (cumulative count 20)
298s 0.030% <= 1.207 milliseconds (cumulative count 30)
298s 0.040% <= 1.303 milliseconds (cumulative count 40)
298s 0.050% <= 1.407 milliseconds (cumulative count 50)
298s 0.070% <= 1.503 milliseconds (cumulative count 70)
298s 0.080% <= 1.607 milliseconds (cumulative count 80)
298s 0.100% <= 1.703 milliseconds (cumulative count 100)
298s 0.200% <= 1.807 milliseconds (cumulative count 200)
298s 0.300% <= 1.903 milliseconds (cumulative count 300)
298s 0.450% <= 2.007 milliseconds (cumulative count 450)
298s 0.510% <= 2.103 milliseconds (cumulative count 510)
298s 1.720% <= 3.103 milliseconds (cumulative count 1720)
298s 2.490% <= 4.103 milliseconds (cumulative count 2490)
298s 3.430% <= 5.103 milliseconds (cumulative count 3430)
298s 4.270% <= 6.103 milliseconds (cumulative count 4270)
298s 5.420% <= 7.103 milliseconds (cumulative count 5420)
298s 6.880% <= 8.103 milliseconds (cumulative count 6880)
298s 8.470% <= 9.103 milliseconds (cumulative count 8470)
298s 10.580% <= 10.103 milliseconds (cumulative count 10580)
298s 13.840% <= 11.103 milliseconds (cumulative count 13840)
298s 18.210% <= 12.103 milliseconds (cumulative count 18210)
298s 22.790% <= 13.103 milliseconds (cumulative count 22790)
298s 27.970% <= 14.103 milliseconds (cumulative count 27970)
298s 32.950% <= 15.103 milliseconds (cumulative count 32950)
298s 37.350% <= 16.103 milliseconds (cumulative count 37350)
298s 41.400% <= 17.103 milliseconds (cumulative count 41400)
298s 45.410% <= 18.111 milliseconds (cumulative count 45410)
298s 49.170% <= 19.103 milliseconds (cumulative count 49170)
298s 52.840% <= 20.111 milliseconds (cumulative count 52840)
298s 56.730% <= 21.103 milliseconds (cumulative count 56730)
298s 60.210% <= 22.111 milliseconds (cumulative count 60210)
298s 64.420% <= 23.103 milliseconds (cumulative count 64420)
298s 68.780% <= 24.111 milliseconds (cumulative count 68780)
298s 72.650% <= 25.103 milliseconds (cumulative count 72650)
298s 76.070% <= 26.111 milliseconds (cumulative count 76070)
298s 79.000% <= 27.103 milliseconds (cumulative count 79000)
298s 81.400% <= 28.111 milliseconds (cumulative count 81400)
298s 83.470% <= 29.103 milliseconds (cumulative count 83470)
298s 85.090% <= 30.111 milliseconds (cumulative count 85090)
298s 86.460% <= 31.103 milliseconds (cumulative count 86460)
298s 88.070% <= 32.111 milliseconds (cumulative count 88070)
298s 89.760% <= 33.119 milliseconds (cumulative count 89760)
298s 91.710% <= 34.111 milliseconds (cumulative count 91710)
298s 93.960% <= 35.103 milliseconds (cumulative count 93960)
298s 95.750% <= 36.127 milliseconds (cumulative count 95750)
298s 97.200% <= 37.119 milliseconds (cumulative count 97200)
298s 98.450% <= 38.111 milliseconds (cumulative count 98450)
298s 99.150% <= 39.103 milliseconds (cumulative count 99150)
298s 99.450% <= 40.127 milliseconds (cumulative count 99450)
298s 99.790% <= 41.119 milliseconds (cumulative count 99790)
298s 99.970% <= 42.111 milliseconds (cumulative count 99970)
298s 100.000% <= 43.103 milliseconds (cumulative count 100000)
298s
298s Summary:
298s throughput summary: 12545.48 requests per second
298s latency summary (msec):
298s avg min p50 p95 p99 max
298s 20.072 0.600 19.327 35.647 38.719 42.751
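Editor's note: every block in this run reports the same host configuration ("save": 3600 1 300 100 60 10000, "appendonly": no), which redis-benchmark queries from the server before each test, so the benchmarks here run against snapshotting defaults with AOF disabled. A sketch of checking those settings directly, assuming a local redis-server on the default port:

  # Sketch: confirm the server-side settings reported in the blocks above
  redis-cli CONFIG GET save
  redis-cli CONFIG GET appendonly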
309s LRANGE_600 (first 600 elements): rps=2332.0 (overall: 8328.6) avg_msec=30.571 (overall: 30.571) LRANGE_600 (first 600 elements): rps=8852.6 (overall: 8738.3) avg_msec=29.499 (overall: 29.722) LRANGE_600 (first 600 elements): rps=10868.5 (overall: 9673.1) avg_msec=21.561 (overall: 25.698) LRANGE_600 (first 600 elements): rps=10745.0 (overall: 10000.0) avg_msec=23.933 (overall: 25.120) LRANGE_600 (first 600 elements): rps=11075.7 (overall: 10251.4) avg_msec=20.810 (overall: 24.032) LRANGE_600 (first 600 elements): rps=8541.8 (overall: 9927.5) avg_msec=31.932 (overall: 25.319) LRANGE_600 (first 600 elements): rps=8410.2 (overall: 9681.8) avg_msec=30.673 (overall: 26.072) LRANGE_600 (first 600 elements): rps=8257.9 (overall: 9486.1) avg_msec=30.973 (overall: 26.659) LRANGE_600 (first 600 elements): rps=8131.5 (overall: 9322.9) avg_msec=30.864 (overall: 27.101) LRANGE_600 (first 600 elements): rps=8724.0 (overall: 9258.8) avg_msec=31.164 (overall: 27.511) LRANGE_600 (first 600 elements): rps=8669.3 (overall: 9201.5) avg_msec=31.516 (overall: 27.877) LRANGE_600 (first 600 elements): rps=10099.2 (overall: 9281.3) avg_msec=23.769 (overall: 27.480) LRANGE_600 (first 600 elements): rps=9683.8 (overall: 9314.2) avg_msec=25.188 (overall: 27.285) LRANGE_600 (first 600 elements): rps=8761.0 (overall: 9272.7) avg_msec=29.359 (overall: 27.432) LRANGE_600 (first 600 elements): rps=8589.6 (overall: 9224.9) avg_msec=31.385 (overall: 27.689) LRANGE_600 (first 600 elements): rps=8318.7 (overall: 9165.8) avg_msec=30.437 (overall: 27.852) LRANGE_600 (first 600 elements): rps=8181.8 (overall: 9105.0) avg_msec=31.231 (overall: 28.040) LRANGE_600 (first 600 elements): rps=8260.9 (overall: 9055.9) avg_msec=30.946 (overall: 28.194) LRANGE_600 (first 600 elements): rps=8589.6 (overall: 9030.4) avg_msec=31.277 (overall: 28.354) LRANGE_600 (first 600 elements): rps=8800.0 (overall: 9018.6) avg_msec=31.360 (overall: 28.505) LRANGE_600 (first 600 elements): rps=9430.8 (overall: 9039.0) avg_msec=26.974 (overall: 28.426) LRANGE_600 (first 600 elements): rps=8691.7 (overall: 9022.6) avg_msec=30.336 (overall: 28.513) LRANGE_600 (first 600 elements): rps=9324.1 (overall: 9036.2) avg_msec=26.442 (overall: 28.416) LRANGE_600 (first 600 elements): rps=11099.6 (overall: 9124.6) avg_msec=19.744 (overall: 27.965) LRANGE_600 (first 600 elements): rps=8231.4 (overall: 9087.3) avg_msec=32.214 (overall: 28.125) LRANGE_600 (first 600 elements): rps=8881.9 (overall: 9079.1) avg_msec=26.629 (overall: 28.067) LRANGE_600 (first 600 elements): rps=8059.5 (overall: 9040.3) avg_msec=31.189 (overall: 28.173) LRANGE_600 (first 600 elements): rps=8342.6 (overall: 9014.8) avg_msec=31.902 (overall: 28.299) LRANGE_600 (first 600 elements): rps=9840.6 (overall: 9043.9) avg_msec=26.732 (overall: 28.239) LRANGE_600 (first 600 elements): rps=10199.2 (overall: 9083.3) avg_msec=21.438 (overall: 27.979) LRANGE_600 (first 600 elements): rps=8565.7 (overall: 9066.2) avg_msec=30.997 (overall: 28.073) LRANGE_600 (first 600 elements): rps=10921.3 (overall: 9126.0) avg_msec=21.728 (overall: 27.828) LRANGE_600 (first 600 elements): rps=9613.9 (overall: 9141.6) avg_msec=24.450 (overall: 27.715) LRANGE_600 (first 600 elements): rps=8666.7 (overall: 9127.1) avg_msec=29.899 (overall: 27.778) LRANGE_600 (first 600 elements): rps=8467.2 (overall: 9107.4) avg_msec=28.740 (overall: 27.805) LRANGE_600 (first 600 elements): rps=8537.8 (overall: 9091.3) avg_msec=30.041 (overall: 27.864) LRANGE_600 (first 600 elements): rps=8664.0 (overall: 9079.6) avg_msec=32.295 (overall: 27.979) LRANGE_600 (first 600 elements): rps=8248.0 (overall: 9057.2) avg_msec=31.359 (overall: 28.062) LRANGE_600 (first 600 elements): rps=8130.4 (overall: 9032.9) avg_msec=31.147 (overall: 28.135) LRANGE_600 (first 600 elements): rps=8260.9 (overall: 9013.2) avg_msec=30.842 (overall: 28.198) LRANGE_600 (first 600 elements): rps=8460.3 (overall: 8999.5) avg_msec=31.572 (overall: 28.277) LRANGE_600 (first 600 elements): rps=8804.8 (overall: 8994.8) avg_msec=31.375 (overall: 28.350) LRANGE_600 (first 600 elements): rps=8336.0 (overall: 8979.2) avg_msec=31.629 (overall: 28.422) LRANGE_600 (first 600 elements): rps=8302.3 (overall: 8963.2) avg_msec=30.497 (overall: 28.468) ====== LRANGE_600 (first 600 elements) ======
309s 100000 requests completed in 11.14 seconds
309s 50 parallel clients
309s 3 bytes payload
309s keep alive: 1
309s host configuration "save": 3600 1 300 100 60 10000
309s host configuration "appendonly": no
309s multi-thread: no
309s
309s Latency by percentile distribution:
309s 0.000% <= 0.951 milliseconds (cumulative count 10)
309s 50.000% <= 29.743 milliseconds (cumulative count 50030)
309s 75.000% <= 37.823 milliseconds (cumulative count 75110)
309s 87.500% <= 40.671 milliseconds (cumulative count 87510)
309s 93.750% <= 42.335 milliseconds (cumulative count 93780)
309s 96.875% <= 43.583 milliseconds (cumulative count 96910)
309s 98.438% <= 44.799 milliseconds (cumulative count 98440)
309s 99.219% <= 45.663 milliseconds (cumulative count 99220)
309s 99.609% <= 46.335 milliseconds (cumulative count 99610)
309s 99.805% <= 47.071 milliseconds (cumulative count 99810)
309s 99.902% <= 47.487 milliseconds (cumulative count 99910)
309s 99.951% <= 47.839 milliseconds (cumulative count 99960)
309s 99.976% <= 47.967 milliseconds (cumulative count 99980)
309s 99.988% <= 48.287 milliseconds (cumulative count 99990)
309s 99.994% <= 48.543 milliseconds (cumulative count 100000)
309s 100.000% <= 48.543 milliseconds (cumulative count 100000)
309s
309s Cumulative distribution of latencies:
309s 0.000% <= 0.103 milliseconds (cumulative count 0)
309s 0.010% <= 1.007 milliseconds (cumulative count 10)
309s 0.020% <= 1.103 milliseconds (cumulative count 20)
309s 0.040% <= 1.303 milliseconds (cumulative count 40)
309s 0.070% <= 1.407 milliseconds (cumulative count 70)
309s 0.110% <= 1.503 milliseconds (cumulative count 110)
309s 0.130% <= 1.607 milliseconds (cumulative count 130)
309s 0.210% <= 1.703 milliseconds (cumulative count 210)
309s 0.400% <= 1.807 milliseconds (cumulative count 400)
309s 0.500% <= 1.903 milliseconds (cumulative count 500)
309s 0.750% <= 2.007 milliseconds (cumulative count 750)
309s 0.920% <= 2.103 milliseconds (cumulative count 920)
309s 5.130% <= 3.103 milliseconds (cumulative count 5130)
309s 6.590% <= 4.103 milliseconds (cumulative count 6590)
309s 6.990% <= 5.103 milliseconds (cumulative count 6990)
309s 7.290% <= 6.103 milliseconds (cumulative count 7290)
309s 7.550% <= 7.103 milliseconds (cumulative count 7550)
309s 7.980% <= 8.103 milliseconds (cumulative count 7980)
309s 8.320% <= 9.103 milliseconds (cumulative count 8320)
309s 8.810% <= 10.103 milliseconds (cumulative count 8810)
309s 9.280% <= 11.103 milliseconds (cumulative count 9280)
309s 9.940% <= 12.103 milliseconds (cumulative count 9940)
309s 10.860% <= 13.103 milliseconds (cumulative count 10860)
309s 12.290% <= 14.103 milliseconds (cumulative count 12290)
309s 13.900% <= 15.103 milliseconds (cumulative count 13900)
309s 15.340% <= 16.103 milliseconds (cumulative count 15340)
309s 16.440% <= 17.103 milliseconds (cumulative count 16440)
309s 17.750% <= 18.111 milliseconds (cumulative count 17750)
309s 18.880% <= 19.103 milliseconds (cumulative count 18880)
309s 20.500% <= 20.111 milliseconds (cumulative count 20500)
309s 22.280% <= 21.103 milliseconds (cumulative count 22280)
309s 23.640% <= 22.111 milliseconds (cumulative count 23640)
309s 25.000% <= 23.103 milliseconds (cumulative count 25000)
309s 26.430% <= 24.111 milliseconds (cumulative count 26430)
309s 29.570% <= 25.103 milliseconds (cumulative count 29570)
309s 32.260% <= 26.111 milliseconds (cumulative count 32260)
309s 36.740% <= 27.103 milliseconds (cumulative count 36740)
309s 41.950% <= 28.111 milliseconds (cumulative count 41950)
309s 47.010% <= 29.103 milliseconds (cumulative count 47010)
309s 51.750% <= 30.111 milliseconds (cumulative count 51750)
309s 56.680% <= 31.103 milliseconds (cumulative count 56680)
309s 61.000% <= 32.111 milliseconds (cumulative count 61000)
309s 64.030% <= 33.119 milliseconds (cumulative count 64030)
309s 66.540% <= 34.111 milliseconds (cumulative count 66540)
309s 69.020% <= 35.103 milliseconds (cumulative count 69020)
309s 71.260% <= 36.127 milliseconds (cumulative count 71260)
309s 73.230% <= 37.119 milliseconds (cumulative count 73230)
309s 76.120% <= 38.111 milliseconds (cumulative count 76120)
309s 80.460% <= 39.103 milliseconds (cumulative count 80460)
309s 85.130% <= 40.127 milliseconds (cumulative count 85130)
309s 89.310% <= 41.119 milliseconds (cumulative count 89310)
309s 92.980% <= 42.111 milliseconds (cumulative count 92980)
309s 95.930% <= 43.103 milliseconds (cumulative count 95930)
309s 97.630% <= 44.127 milliseconds (cumulative count 97630)
309s 98.790% <= 45.119 milliseconds (cumulative count 98790)
309s 99.520% <= 46.111 milliseconds (cumulative count 99520)
309s 99.810% <= 47.103 milliseconds (cumulative count 99810)
309s 99.980% <= 48.127 milliseconds (cumulative count 99980)
309s 100.000% <= 49.119 milliseconds (cumulative count 100000)
309s
309s Summary:
309s throughput summary: 8979.89 requests per second
309s latency summary (msec):
309s avg min p50 p95 p99 max
309s 28.423 0.944 29.743 42.751 45.343 48.543
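Editor's note: the full percentile dumps above are useful for spotting tail latency, but for a quick comparison redis-benchmark's quiet mode prints one summary line per test. A minimal sketch, assuming the same local setup and that -t accepts the lpush and lrange test groups as names:

  # Sketch: condensed one-line-per-test output for the list benchmarks
  redis-benchmark -q -t lpush,lrange -n 100000 -c 50 -d 3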
310s MSET (10 keys): rps=24761.9 (overall: 152195.1) avg_msec=2.955 (overall: 2.955) MSET (10 keys): rps=159400.0 (overall: 158384.9) avg_msec=2.991 (overall: 2.986) MSET (10 keys): rps=159800.8 (overall: 159040.6) avg_msec=2.987 (overall: 2.986) ====== MSET (10 keys) ======
310s 100000 requests completed in 0.63 seconds
310s 50 parallel clients
310s 3 bytes payload
310s keep alive: 1
310s host configuration "save": 3600 1 300 100 60 10000
310s host configuration "appendonly": no
310s multi-thread: no
310s
310s Latency by percentile distribution:
310s 0.000% <= 0.551 milliseconds (cumulative count 10)
310s 50.000% <= 3.023 milliseconds (cumulative count 50150)
310s 75.000% <= 3.191 milliseconds (cumulative count 75400)
310s 87.500% <= 3.295 milliseconds (cumulative count 87600)
310s 93.750% <= 3.383 milliseconds (cumulative count 94050)
310s 96.875% <= 3.463 milliseconds (cumulative count 97130)
310s 98.438% <= 3.543 milliseconds (cumulative count 98540)
310s 99.219% <= 3.639 milliseconds (cumulative count 99260)
310s 99.609% <= 3.719 milliseconds (cumulative count 99630)
310s 99.805% <= 3.791 milliseconds (cumulative count 99810)
310s 99.902% <= 3.871 milliseconds (cumulative count 99910)
310s 99.951% <= 3.951 milliseconds (cumulative count 99970)
310s 99.976% <= 3.983 milliseconds (cumulative count 99980)
310s 99.988% <= 3.999 milliseconds (cumulative count 99990)
310s 99.994% <= 4.023 milliseconds (cumulative count 100000)
310s 100.000% <= 4.023 milliseconds (cumulative count 100000)
310s
310s Cumulative distribution of latencies:
310s 0.000% <= 0.103 milliseconds (cumulative count 0)
310s 0.010% <= 0.607 milliseconds (cumulative count 10)
310s 0.040% <= 0.903 milliseconds (cumulative count 40)
310s 0.060% <= 1.007 milliseconds (cumulative count 60)
310s 0.090% <= 1.103 milliseconds (cumulative count 90)
310s 0.150% <= 1.207 milliseconds (cumulative count 150)
310s 0.190% <= 1.303 milliseconds (cumulative count 190)
310s 0.350% <= 1.607 milliseconds (cumulative count 350)
310s 0.670% <= 1.703 milliseconds (cumulative count 670)
310s 2.150% <= 1.807 milliseconds (cumulative count 2150)
310s 3.640% <= 1.903 milliseconds (cumulative count 3640)
310s 4.830% <= 2.007 milliseconds (cumulative count 4830)
310s 5.180% <= 2.103 milliseconds (cumulative count 5180)
310s 62.710% <= 3.103 milliseconds (cumulative count 62710)
310s 100.000% <= 4.103 milliseconds (cumulative count 100000)
310s
310s Summary:
310s throughput summary: 159235.66 requests per second
310s latency summary (msec):
310s avg min p50 p95 p99 max
310s 2.985 0.544 3.023 3.407 3.599 4.023
310s
310s autopkgtest [16:57:39]: test 0002-benchmark: -----------------------]
314s autopkgtest [16:57:43]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - -
314s 0002-benchmark PASS
318s autopkgtest [16:57:47]: test 0003-redis-check-aof: preparing testbed
320s Reading package lists...
320s Building dependency tree...
320s Reading state information...
320s Starting pkgProblemResolver with broken count: 0
320s Starting 2 pkgProblemResolver with broken count: 0
320s Done
321s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
334s autopkgtest [16:58:03]: test 0003-redis-check-aof: [-----------------------
336s autopkgtest [16:58:05]: test 0003-redis-check-aof: -----------------------]
340s 0003-redis-check-aof PASS
340s autopkgtest [16:58:09]: test 0003-redis-check-aof: - - - - - - - - - - results - - - - - - - - - -
344s autopkgtest [16:58:13]: test 0004-redis-check-rdb: preparing testbed
346s Reading package lists...
346s Building dependency tree...
346s Reading state information...
346s Starting pkgProblemResolver with broken count: 0
346s Starting 2 pkgProblemResolver with broken count: 0
346s Done
347s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
360s autopkgtest [16:58:29]: test 0004-redis-check-rdb: [-----------------------
367s OK
367s [offset 0] Checking RDB file /var/lib/redis/dump.rdb
367s [offset 27] AUX FIELD redis-ver = '7.0.15'
367s [offset 41] AUX FIELD redis-bits = '32'
367s [offset 53] AUX FIELD ctime = '1742057916'
367s [offset 68] AUX FIELD used-mem = '1392152'
367s [offset 80] AUX FIELD aof-base = '0'
367s [offset 82] Selecting DB ID 0
367s [offset 7184] Checksum OK
367s [offset 7184] \o/ RDB looks OK! \o/
367s [info] 4 keys read
367s [info] 0 expires
367s [info] 0 already expired
368s autopkgtest [16:58:37]: test 0004-redis-check-rdb: -----------------------]
371s autopkgtest [16:58:40]: test 0004-redis-check-rdb: - - - - - - - - - - results - - - - - - - - - -
371s 0004-redis-check-rdb PASS
375s autopkgtest [16:58:44]: test 0005-cjson: preparing testbed
377s Reading package lists...
377s Building dependency tree...
377s Reading state information...
377s Starting pkgProblemResolver with broken count: 0
378s Starting 2 pkgProblemResolver with broken count: 0
378s Done
378s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
392s autopkgtest [16:59:01]: test 0005-cjson: [-----------------------
399s
399s autopkgtest [16:59:08]: test 0005-cjson: -----------------------]
403s autopkgtest [16:59:12]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
403s 0005-cjson PASS
407s autopkgtest [16:59:16]: @@@@@@@@@@@@@@@@@@@@ summary
407s 0001-redis-cli PASS
407s 0002-benchmark PASS
407s 0003-redis-check-aof PASS
407s 0004-redis-check-rdb PASS
407s 0005-cjson PASS
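Editor's note: tests 0003 and 0004 above exercise the offline integrity checkers shipped with the redis package; the 0004 output shows the exact file checked (/var/lib/redis/dump.rdb). A sketch of running both checks by hand; the RDB path is taken from the log, while the AOF path is only an illustrative placeholder since this excerpt does not show which file the 0003 test feeds to redis-check-aof.

  # Sketch: offline consistency checks as exercised by tests 0003/0004
  redis-check-rdb /var/lib/redis/dump.rdb
  redis-check-aof /path/to/appendonly.aof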