0s autopkgtest [18:15:54]: starting date and time: 2025-03-15 18:15:54+0000
0s autopkgtest [18:15:54]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [18:15:54]: host juju-7f2275-prod-proposed-migration-environment-9; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.p8029use/out --timeout-copy=6000 --setup-commands 'ln -s /dev/null /etc/systemd/system/bluetooth.service; printf "http_proxy=http://squid.internal:3128\nhttps_proxy=http://squid.internal:3128\nno_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com\n" >> /etc/environment' --apt-pocket=proposed=src:glibc --apt-upgrade valkey --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=glibc/2.41-1ubuntu2 -- lxd -r lxd-armhf-10.145.243.227 lxd-armhf-10.145.243.227:autopkgtest/ubuntu/plucky/armhf
20s autopkgtest [18:16:14]: testbed dpkg architecture: armhf
22s autopkgtest [18:16:16]: testbed apt version: 2.9.33
26s autopkgtest [18:16:20]: @@@@@@@@@@@@@@@@@@@@ test bed setup
27s autopkgtest [18:16:21]: testbed release detected to be: None
35s autopkgtest [18:16:29]: updating testbed package index (apt update)
37s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
37s Get:2 http://ftpmaster.internal/ubuntu plucky InRelease [257 kB]
37s Get:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease [126 kB]
38s Get:4 http://ftpmaster.internal/ubuntu plucky-security InRelease [126 kB]
38s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [379 kB]
38s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
38s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [99.7 kB]
38s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf Packages [114 kB]
38s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf c-n-f Metadata [1832 B]
38s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted armhf c-n-f Metadata [116 B]
38s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf Packages [312 kB]
39s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf c-n-f Metadata [11.1 kB]
39s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf Packages [3472 B]
39s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf c-n-f Metadata [240 B]
39s Get:15 http://ftpmaster.internal/ubuntu plucky/main Sources [1394 kB]
40s Get:16 http://ftpmaster.internal/ubuntu plucky/multiverse Sources [299 kB]
40s Get:17 http://ftpmaster.internal/ubuntu plucky/universe Sources [21.0 MB]
55s Get:18 http://ftpmaster.internal/ubuntu plucky/main armhf Packages [1378 kB]
56s Get:19 http://ftpmaster.internal/ubuntu plucky/main armhf c-n-f Metadata [29.4 kB]
56s Get:20 http://ftpmaster.internal/ubuntu plucky/restricted armhf c-n-f Metadata [108 B]
56s Get:21 http://ftpmaster.internal/ubuntu plucky/universe armhf Packages [15.1 MB]
68s Get:22 http://ftpmaster.internal/ubuntu plucky/multiverse armhf Packages [172 kB]
70s Fetched 41.0 MB in 33s (1257 kB/s)
71s Reading package lists...
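Note on the run configuration above: --apt-pocket=proposed=src:glibc tells autopkgtest to enable the plucky-proposed pocket but only let packages built from the glibc source (the migration trigger) upgrade from it, while the rest of the archive stays at its normal priority. A minimal sketch of equivalent apt preferences, assuming illustrative priorities and file name (the stanza autopkgtest actually writes may differ, e.g. it may expand src:glibc into the list of its binary packages):

    # /etc/apt/preferences.d/proposed-src-glibc  (hypothetical path)
    # Prefer binaries built from src:glibc when they come from proposed.
    Package: src:glibc
    Pin: release a=plucky-proposed
    Pin-Priority: 995

    # Keep everything else in plucky-proposed below the default priority.
    Package: *
    Pin: release a=plucky-proposed
    Pin-Priority: 100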
76s autopkgtest [18:17:10]: upgrading testbed (apt dist-upgrade and autopurge)
78s Reading package lists...
78s Building dependency tree...
78s Reading state information...
79s Calculating upgrade...Starting pkgProblemResolver with broken count: 0
79s Starting 2 pkgProblemResolver with broken count: 0
79s Done
80s Entering ResolveByKeep
80s
80s Calculating upgrade...
81s The following packages will be upgraded:
81s   libc-bin libc6 locales pinentry-curses python3-jinja2 sos strace
81s 7 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
81s Need to get 8683 kB of archives.
81s After this operation, 23.6 kB of additional disk space will be used.
81s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc6 armhf 2.41-1ubuntu2 [2932 kB]
83s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc-bin armhf 2.41-1ubuntu2 [545 kB]
84s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf locales all 2.41-1ubuntu2 [4246 kB]
87s Get:4 http://ftpmaster.internal/ubuntu plucky/main armhf strace armhf 6.13+ds-1ubuntu1 [445 kB]
88s Get:5 http://ftpmaster.internal/ubuntu plucky/main armhf pinentry-curses armhf 1.3.1-2ubuntu3 [40.6 kB]
88s Get:6 http://ftpmaster.internal/ubuntu plucky/main armhf python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
88s Get:7 http://ftpmaster.internal/ubuntu plucky/main armhf sos all 4.9.0-5 [365 kB]
89s Preconfiguring packages ...
89s Fetched 8683 kB in 8s (1153 kB/s)
89s (Reading database ... 64655 files and directories currently installed.)
89s Preparing to unpack .../libc6_2.41-1ubuntu2_armhf.deb ...
89s Unpacking libc6:armhf (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
89s Setting up libc6:armhf (2.41-1ubuntu2) ...
89s (Reading database ... 64655 files and directories currently installed.)
89s Preparing to unpack .../libc-bin_2.41-1ubuntu2_armhf.deb ...
89s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
90s Setting up libc-bin (2.41-1ubuntu2) ...
90s (Reading database ... 64655 files and directories currently installed.)
90s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
90s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
90s Preparing to unpack .../strace_6.13+ds-1ubuntu1_armhf.deb ...
90s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
90s Preparing to unpack .../pinentry-curses_1.3.1-2ubuntu3_armhf.deb ...
90s Unpacking pinentry-curses (1.3.1-2ubuntu3) over (1.3.1-2ubuntu2) ...
90s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
90s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
90s Preparing to unpack .../archives/sos_4.9.0-5_all.deb ...
90s Unpacking sos (4.9.0-5) over (4.9.0-4) ...
91s Setting up sos (4.9.0-5) ...
91s Setting up pinentry-curses (1.3.1-2ubuntu3) ...
91s Setting up locales (2.41-1ubuntu2) ...
92s Generating locales (this might take a while)...
94s   en_US.UTF-8... done
94s Generation complete.
94s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
94s Setting up strace (6.13+ds-1ubuntu1) ...
94s Processing triggers for man-db (2.13.0-1) ...
95s Processing triggers for systemd (257.3-1ubuntu3) ...
98s Reading package lists...
98s Building dependency tree...
98s Reading state information...
98s Starting pkgProblemResolver with broken count: 0
98s Starting 2 pkgProblemResolver with broken count: 0
98s Done
99s Solving dependencies...
99s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
102s autopkgtest [18:17:36]: rebooting testbed after setup commands that affected boot
142s autopkgtest [18:18:16]: testbed running kernel: Linux 6.8.0-52-generic #53~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Wed Jan 15 18:10:51 UTC 2
168s autopkgtest [18:18:41]: @@@@@@@@@@@@@@@@@@@@ apt-source valkey
191s Get:1 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (dsc) [2484 B]
191s Get:2 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (tar) [2599 kB]
191s Get:3 http://ftpmaster.internal/ubuntu plucky/universe valkey 8.0.2+dfsg1-1ubuntu1 (diff) [18.1 kB]
191s gpgv: Signature made Wed Feb 12 14:50:45 2025 UTC
191s gpgv:                using RSA key 63EEFC3DE14D5146CE7F24BF34B8AD7D9529E793
191s gpgv:                issuer "lena.voytek@canonical.com"
191s gpgv: Can't check signature: No public key
191s dpkg-source: warning: cannot verify inline signature for ./valkey_8.0.2+dfsg1-1ubuntu1.dsc: no acceptable signature found
191s autopkgtest [18:19:05]: testing package valkey version 8.0.2+dfsg1-1ubuntu1
194s autopkgtest [18:19:08]: build not needed
198s autopkgtest [18:19:12]: test 0001-valkey-cli: preparing testbed
200s Reading package lists...
200s Building dependency tree...
200s Reading state information...
201s Starting pkgProblemResolver with broken count: 0
201s Starting 2 pkgProblemResolver with broken count: 0
201s Done
201s The following NEW packages will be installed:
201s   liblzf1 valkey-server valkey-tools
202s 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded.
202s Need to get 1225 kB of archives.
202s After this operation, 4955 kB of additional disk space will be used.
202s Get:1 http://ftpmaster.internal/ubuntu plucky/universe armhf liblzf1 armhf 3.6-4 [6554 B]
202s Get:2 http://ftpmaster.internal/ubuntu plucky/universe armhf valkey-tools armhf 8.0.2+dfsg1-1ubuntu1 [1170 kB]
203s Get:3 http://ftpmaster.internal/ubuntu plucky/universe armhf valkey-server armhf 8.0.2+dfsg1-1ubuntu1 [48.5 kB]
203s Fetched 1225 kB in 1s (1129 kB/s)
203s Selecting previously unselected package liblzf1:armhf.
203s (Reading database ... 64655 files and directories currently installed.)
203s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ...
203s Unpacking liblzf1:armhf (3.6-4) ...
203s Selecting previously unselected package valkey-tools.
203s Preparing to unpack .../valkey-tools_8.0.2+dfsg1-1ubuntu1_armhf.deb ...
203s Unpacking valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
203s Selecting previously unselected package valkey-server.
203s Preparing to unpack .../valkey-server_8.0.2+dfsg1-1ubuntu1_armhf.deb ...
203s Unpacking valkey-server (8.0.2+dfsg1-1ubuntu1) ...
203s Setting up liblzf1:armhf (3.6-4) ...
203s Setting up valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
203s Setting up valkey-server (8.0.2+dfsg1-1ubuntu1) ...
204s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
204s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
204s Processing triggers for man-db (2.13.0-1) ...
204s Processing triggers for libc-bin (2.41-1ubuntu2) ...
213s autopkgtest [18:19:27]: test 0001-valkey-cli: [-----------------------
220s # Server
220s redis_version:7.2.4
220s server_name:valkey
220s valkey_version:8.0.2
220s redis_git_sha1:00000000
220s redis_git_dirty:0
220s redis_build_id:5fe77b42c48a3400
220s server_mode:standalone
220s os:Linux 6.8.0-52-generic armv7l
220s arch_bits:32
220s monotonic_clock:POSIX clock_gettime
220s multiplexing_api:epoll
220s gcc_version:14.2.0
220s process_id:1075
220s process_supervised:systemd
220s run_id:d3983c50a8ad429fad4737aeabfb19794400c431
220s tcp_port:6379
220s server_time_usec:1742062774390357
220s uptime_in_seconds:5
220s uptime_in_days:0
220s hz:10
220s configured_hz:10
220s lru_clock:14009526
220s executable:/usr/bin/valkey-server
220s config_file:/etc/valkey/valkey.conf
220s io_threads_active:0
220s availability_zone:
220s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379
220s
220s # Clients
220s connected_clients:1
220s cluster_connections:0
220s maxclients:10000
220s client_recent_max_input_buffer:0
220s client_recent_max_output_buffer:0
220s blocked_clients:0
220s tracking_clients:0
220s pubsub_clients:0
220s watching_clients:0
220s clients_in_timeout_table:0
220s total_watched_keys:0
220s total_blocking_keys:0
220s total_blocking_keys_on_nokey:0
220s
220s # Memory
220s used_memory:771832
220s used_memory_human:753.74K
220s used_memory_rss:9961472
220s used_memory_rss_human:9.50M
220s used_memory_peak:771832
220s used_memory_peak_human:753.74K
220s used_memory_peak_perc:100.34%
220s used_memory_overhead:751880
220s used_memory_startup:751760
220s used_memory_dataset:19952
220s used_memory_dataset_perc:99.40%
220s allocator_allocated:3988736
220s allocator_active:9502720
220s allocator_resident:10289152
220s allocator_muzzy:0
220s total_system_memory:3844116480
220s total_system_memory_human:3.58G
220s used_memory_lua:23552
220s used_memory_vm_eval:23552
220s used_memory_lua_human:23.00K
220s used_memory_scripts_eval:0
220s number_of_cached_scripts:0
220s number_of_functions:0
220s number_of_libraries:0
220s used_memory_vm_functions:24576
220s used_memory_vm_total:48128
220s used_memory_vm_total_human:47.00K
220s used_memory_functions:120
220s used_memory_scripts:120
220s used_memory_scripts_human:120B
220s maxmemory:3221225472
220s maxmemory_human:3.00G
220s maxmemory_policy:noeviction
220s allocator_frag_ratio:2.38
220s allocator_frag_bytes:5513984
220s allocator_rss_ratio:1.08
220s allocator_rss_bytes:786432
220s rss_overhead_ratio:0.97
220s rss_overhead_bytes:-327680
220s mem_fragmentation_ratio:13.25
220s mem_fragmentation_bytes:9209624
220s mem_not_counted_for_evict:0
220s mem_replication_backlog:0
220s mem_total_replication_buffers:0
220s mem_clients_slaves:0
220s mem_clients_normal:0
220s mem_cluster_links:0
220s mem_aof_buffer:0
220s mem_allocator:jemalloc-5.3.0
220s mem_overhead_db_hashtable_rehashing:0
220s active_defrag_running:0
220s lazyfree_pending_objects:0
220s lazyfreed_objects:0
220s
220s # Persistence
220s loading:0
220s async_loading:0
220s current_cow_peak:0
220s current_cow_size:0
220s current_cow_size_age:0
220s current_fork_perc:0.00
220s current_save_keys_processed:0
220s current_save_keys_total:0
220s rdb_changes_since_last_save:0
220s rdb_bgsave_in_progress:0
220s rdb_last_save_time:1742062769
220s rdb_last_bgsave_status:ok
220s rdb_last_bgsave_time_sec:-1
220s rdb_current_bgsave_time_sec:-1
220s rdb_saves:0
220s rdb_last_cow_size:0
220s rdb_last_load_keys_expired:0
220s rdb_last_load_keys_loaded:0
220s aof_enabled:0
220s aof_rewrite_in_progress:0
220s aof_rewrite_scheduled:0
220s aof_last_rewrite_time_sec:-1
220s aof_current_rewrite_time_sec:-1
220s aof_last_bgrewrite_status:ok
220s aof_rewrites:0
220s aof_rewrites_consecutive_failures:0
220s aof_last_write_status:ok
220s aof_last_cow_size:0
220s module_fork_in_progress:0
220s module_fork_last_cow_size:0
220s
220s # Stats
220s total_connections_received:1
220s total_commands_processed:0
220s instantaneous_ops_per_sec:0
220s total_net_input_bytes:14
220s total_net_output_bytes:0
220s total_net_repl_input_bytes:0
220s total_net_repl_output_bytes:0
220s instantaneous_input_kbps:0.00
220s instantaneous_output_kbps:0.00
220s instantaneous_input_repl_kbps:0.00
220s instantaneous_output_repl_kbps:0.00
220s rejected_connections:0
220s sync_full:0
220s sync_partial_ok:0
220s sync_partial_err:0
220s expired_keys:0
220s expired_stale_perc:0.00
220s expired_time_cap_reached_count:0
220s expire_cycle_cpu_milliseconds:0
220s evicted_keys:0
220s evicted_clients:0
220s evicted_scripts:0
220s total_eviction_exceeded_time:0
220s current_eviction_exceeded_time:0
220s keyspace_hits:0
220s keyspace_misses:0
220s pubsub_channels:0
220s pubsub_patterns:0
220s pubsubshard_channels:0
220s latest_fork_usec:0
220s total_forks:0
220s migrate_cached_sockets:0
220s slave_expires_tracked_keys:0
220s active_defrag_hits:0
220s active_defrag_misses:0
220s active_defrag_key_hits:0
220s active_defrag_key_misses:0
220s total_active_defrag_time:0
220s current_active_defrag_time:0
220s tracking_total_keys:0
220s tracking_total_items:0
220s tracking_total_prefixes:0
220s unexpected_error_replies:0
220s total_error_replies:0
220s dump_payload_sanitizations:0
220s total_reads_processed:1
220s total_writes_processed:0
220s io_threaded_reads_processed:0
220s io_threaded_writes_processed:0
220s io_threaded_freed_objects:0
220s io_threaded_poll_processed:0
220s io_threaded_total_prefetch_batches:0
220s io_threaded_total_prefetch_entries:0
220s client_query_buffer_limit_disconnections:0
220s client_output_buffer_limit_disconnections:0
220s reply_buffer_shrinks:0
220s reply_buffer_expands:0
220s eventloop_cycles:51
220s eventloop_duration_sum:19918
220s eventloop_duration_cmd_sum:0
220s instantaneous_eventloop_cycles_per_sec:9
220s instantaneous_eventloop_duration_usec:445
220s acl_access_denied_auth:0
220s acl_access_denied_cmd:0
220s acl_access_denied_key:0
220s acl_access_denied_channel:0
220s
220s # Replication
220s role:master
220s connected_slaves:0
220s replicas_waiting_psync:0
220s master_failover_state:no-failover
220s master_replid:bba99a83ab19fd2f6f6085e5edce82a0fd7e617f
220s master_replid2:0000000000000000000000000000000000000000
220s master_repl_offset:0
220s second_repl_offset:-1
220s repl_backlog_active:0
220s repl_backlog_size:10485760
220s repl_backlog_first_byte_offset:0
220s repl_backlog_histlen:0
220s
220s # CPU
220s used_cpu_sys:0.039259
220s used_cpu_user:0.067028
220s used_cpu_sys_children:0.000817
220s used_cpu_user_children:0.000449
220s used_cpu_sys_main_thread:0.038453
220s used_cpu_user_main_thread:0.067293
220s
220s # Modules
220s
220s # Errorstats
220s
220s # Cluster
220s cluster_enabled:0
220s
220s # Keyspace
220s Redis ver. 8.0.2
220s autopkgtest [18:19:34]: test 0001-valkey-cli: -----------------------]
230s 0001-valkey-cli PASS
230s autopkgtest [18:19:44]: test 0001-valkey-cli: - - - - - - - - - - results - - - - - - - - - -
236s autopkgtest [18:19:50]: test 0002-benchmark: preparing testbed
238s Reading package lists...
239s Building dependency tree...
239s Reading state information...
239s Starting pkgProblemResolver with broken count: 0
239s Starting 2 pkgProblemResolver with broken count: 0
239s Done
240s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
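The 0002-benchmark output below is valkey-benchmark's standard report (the packaged test script itself is not shown in this log). The configuration echoed in every block, 50 parallel clients, 100000 requests, 3-byte payload, keep alive 1, matches the tool's defaults, so a comparable manual run against the local server would be roughly (flags as in the redis-benchmark interface valkey-benchmark inherits; host/port assumed):

    valkey-benchmark -h 127.0.0.1 -p 6379 -c 50 -n 100000 -d 3

Each per-command block then reports a latency-by-percentile table, a cumulative latency distribution, and a throughput/latency summary.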
248s autopkgtest [18:20:02]: test 0002-benchmark: [----------------------- 256s PING_INLINE: rps=0.0 (overall: nan) avg_msec=nan (overall: nan) PING_INLINE: rps=348446.2 (overall: 348446.2) avg_msec=1.217 (overall: 1.217) ====== PING_INLINE ====== 256s 100000 requests completed in 0.29 seconds 256s 50 parallel clients 256s 3 bytes payload 256s keep alive: 1 256s host configuration "save": 3600 1 300 100 60 10000 256s host configuration "appendonly": no 256s multi-thread: no 256s 256s Latency by percentile distribution: 256s 0.000% <= 0.487 milliseconds (cumulative count 10) 256s 50.000% <= 1.183 milliseconds (cumulative count 51240) 256s 75.000% <= 1.399 milliseconds (cumulative count 75630) 256s 87.500% <= 1.559 milliseconds (cumulative count 87690) 256s 93.750% <= 1.655 milliseconds (cumulative count 94060) 256s 96.875% <= 1.727 milliseconds (cumulative count 96980) 256s 98.438% <= 1.815 milliseconds (cumulative count 98510) 256s 99.219% <= 1.895 milliseconds (cumulative count 99250) 256s 99.609% <= 2.071 milliseconds (cumulative count 99610) 256s 99.805% <= 2.367 milliseconds (cumulative count 99810) 256s 99.902% <= 2.471 milliseconds (cumulative count 99910) 256s 99.951% <= 2.535 milliseconds (cumulative count 99960) 256s 99.976% <= 2.575 milliseconds (cumulative count 99980) 256s 99.988% <= 2.623 milliseconds (cumulative count 99990) 256s 99.994% <= 2.983 milliseconds (cumulative count 100000) 256s 100.000% <= 2.983 milliseconds (cumulative count 100000) 256s 256s Cumulative distribution of latencies: 256s 0.000% <= 0.103 milliseconds (cumulative count 0) 256s 0.020% <= 0.503 milliseconds (cumulative count 20) 256s 0.490% <= 0.607 milliseconds (cumulative count 490) 256s 2.380% <= 0.703 milliseconds (cumulative count 2380) 256s 5.490% <= 0.807 milliseconds (cumulative count 5490) 256s 10.600% <= 0.903 milliseconds (cumulative count 10600) 256s 21.420% <= 1.007 milliseconds (cumulative count 21420) 256s 37.100% <= 1.103 milliseconds (cumulative count 37100) 256s 54.820% <= 1.207 milliseconds (cumulative count 54820) 256s 66.580% <= 1.303 milliseconds (cumulative count 66580) 256s 76.190% <= 1.407 milliseconds (cumulative count 76190) 256s 83.650% <= 1.503 milliseconds (cumulative count 83650) 256s 91.010% <= 1.607 milliseconds (cumulative count 91010) 256s 96.270% <= 1.703 milliseconds (cumulative count 96270) 256s 98.410% <= 1.807 milliseconds (cumulative count 98410) 256s 99.290% <= 1.903 milliseconds (cumulative count 99290) 256s 99.530% <= 2.007 milliseconds (cumulative count 99530) 256s 99.640% <= 2.103 milliseconds (cumulative count 99640) 256s 100.000% <= 3.103 milliseconds (cumulative count 100000) 256s 256s Summary: 256s throughput summary: 349650.34 requests per second 256s latency summary (msec): 256s avg min p50 p95 p99 max 256s 1.214 0.480 1.183 1.679 1.863 2.983 257s PING_MBULK: rps=317640.0 (overall: 374575.5) avg_msec=1.107 (overall: 1.107) ====== PING_MBULK ====== 257s 100000 requests completed in 0.27 seconds 257s 50 parallel clients 257s 3 bytes payload 257s keep alive: 1 257s host configuration "save": 3600 1 300 100 60 10000 257s host configuration "appendonly": no 257s multi-thread: no 257s 257s Latency by percentile distribution: 257s 0.000% <= 0.455 milliseconds (cumulative count 10) 257s 50.000% <= 1.087 milliseconds (cumulative count 50100) 257s 75.000% <= 1.279 milliseconds (cumulative count 75010) 257s 87.500% <= 1.431 milliseconds (cumulative count 87770) 257s 93.750% <= 1.535 milliseconds (cumulative count 94170) 257s 96.875% <= 1.599 milliseconds 
(cumulative count 96920) 257s 98.438% <= 1.679 milliseconds (cumulative count 98440) 257s 99.219% <= 1.743 milliseconds (cumulative count 99220) 257s 99.609% <= 1.807 milliseconds (cumulative count 99650) 257s 99.805% <= 1.895 milliseconds (cumulative count 99820) 257s 99.902% <= 1.935 milliseconds (cumulative count 99910) 257s 99.951% <= 1.991 milliseconds (cumulative count 99960) 257s 99.976% <= 2.031 milliseconds (cumulative count 99980) 257s 99.988% <= 2.055 milliseconds (cumulative count 99990) 257s 99.994% <= 2.079 milliseconds (cumulative count 100000) 257s 100.000% <= 2.079 milliseconds (cumulative count 100000) 257s 257s Cumulative distribution of latencies: 257s 0.000% <= 0.103 milliseconds (cumulative count 0) 257s 0.150% <= 0.503 milliseconds (cumulative count 150) 257s 2.020% <= 0.607 milliseconds (cumulative count 2020) 257s 6.070% <= 0.703 milliseconds (cumulative count 6070) 257s 12.290% <= 0.807 milliseconds (cumulative count 12290) 257s 21.510% <= 0.903 milliseconds (cumulative count 21510) 257s 35.910% <= 1.007 milliseconds (cumulative count 35910) 257s 53.090% <= 1.103 milliseconds (cumulative count 53090) 257s 67.480% <= 1.207 milliseconds (cumulative count 67480) 257s 77.210% <= 1.303 milliseconds (cumulative count 77210) 257s 85.940% <= 1.407 milliseconds (cumulative count 85940) 257s 92.330% <= 1.503 milliseconds (cumulative count 92330) 257s 97.160% <= 1.607 milliseconds (cumulative count 97160) 257s 98.760% <= 1.703 milliseconds (cumulative count 98760) 257s 99.650% <= 1.807 milliseconds (cumulative count 99650) 257s 99.840% <= 1.903 milliseconds (cumulative count 99840) 257s 99.960% <= 2.007 milliseconds (cumulative count 99960) 257s 100.000% <= 2.103 milliseconds (cumulative count 100000) 257s 257s Summary: 257s throughput summary: 374531.84 requests per second 257s latency summary (msec): 257s avg min p50 p95 p99 max 257s 1.107 0.448 1.087 1.559 1.727 2.079 257s SET: rps=281115.6 (overall: 367500.0) avg_msec=1.202 (overall: 1.202) ====== SET ====== 257s 100000 requests completed in 0.26 seconds 257s 50 parallel clients 257s 3 bytes payload 257s keep alive: 1 257s host configuration "save": 3600 1 300 100 60 10000 257s host configuration "appendonly": no 257s multi-thread: no 257s 257s Latency by percentile distribution: 257s 0.000% <= 0.311 milliseconds (cumulative count 10) 257s 50.000% <= 1.143 milliseconds (cumulative count 50140) 257s 75.000% <= 1.295 milliseconds (cumulative count 75330) 257s 87.500% <= 1.439 milliseconds (cumulative count 87880) 257s 93.750% <= 1.631 milliseconds (cumulative count 93820) 257s 96.875% <= 1.767 milliseconds (cumulative count 97050) 257s 98.438% <= 1.839 milliseconds (cumulative count 98440) 257s 99.219% <= 1.935 milliseconds (cumulative count 99250) 257s 99.609% <= 2.079 milliseconds (cumulative count 99620) 257s 99.805% <= 2.359 milliseconds (cumulative count 99820) 257s 99.902% <= 2.639 milliseconds (cumulative count 99910) 257s 99.951% <= 2.839 milliseconds (cumulative count 99960) 257s 99.976% <= 2.935 milliseconds (cumulative count 99980) 257s 99.988% <= 2.967 milliseconds (cumulative count 99990) 257s 99.994% <= 2.999 milliseconds (cumulative count 100000) 257s 100.000% <= 2.999 milliseconds (cumulative count 100000) 257s 257s Cumulative distribution of latencies: 257s 0.000% <= 0.103 milliseconds (cumulative count 0) 257s 0.040% <= 0.407 milliseconds (cumulative count 40) 257s 0.190% <= 0.503 milliseconds (cumulative count 190) 257s 0.360% <= 0.607 milliseconds (cumulative count 360) 257s 1.090% <= 0.703 milliseconds 
(cumulative count 1090) 257s 7.810% <= 0.807 milliseconds (cumulative count 7810) 257s 14.030% <= 0.903 milliseconds (cumulative count 14030) 257s 23.950% <= 1.007 milliseconds (cumulative count 23950) 257s 41.970% <= 1.103 milliseconds (cumulative count 41970) 257s 62.180% <= 1.207 milliseconds (cumulative count 62180) 257s 76.500% <= 1.303 milliseconds (cumulative count 76500) 257s 86.240% <= 1.407 milliseconds (cumulative count 86240) 257s 90.210% <= 1.503 milliseconds (cumulative count 90210) 257s 93.210% <= 1.607 milliseconds (cumulative count 93210) 257s 95.590% <= 1.703 milliseconds (cumulative count 95590) 257s 97.850% <= 1.807 milliseconds (cumulative count 97850) 257s 99.100% <= 1.903 milliseconds (cumulative count 99100) 257s 99.480% <= 2.007 milliseconds (cumulative count 99480) 257s 99.630% <= 2.103 milliseconds (cumulative count 99630) 257s 100.000% <= 3.103 milliseconds (cumulative count 100000) 257s 257s Summary: 257s throughput summary: 383141.75 requests per second 257s latency summary (msec): 257s avg min p50 p95 p99 max 257s 1.168 0.304 1.143 1.679 1.895 2.999 257s GET: rps=340520.0 (overall: 475586.6) avg_msec=0.950 (overall: 0.950) ====== GET ====== 257s 100000 requests completed in 0.21 seconds 257s 50 parallel clients 257s 3 bytes payload 257s keep alive: 1 257s host configuration "save": 3600 1 300 100 60 10000 257s host configuration "appendonly": no 257s multi-thread: no 257s 257s Latency by percentile distribution: 257s 0.000% <= 0.295 milliseconds (cumulative count 10) 257s 50.000% <= 0.959 milliseconds (cumulative count 51270) 257s 75.000% <= 1.071 milliseconds (cumulative count 75390) 257s 87.500% <= 1.151 milliseconds (cumulative count 87740) 257s 93.750% <= 1.199 milliseconds (cumulative count 93750) 257s 96.875% <= 1.255 milliseconds (cumulative count 97170) 257s 98.438% <= 1.311 milliseconds (cumulative count 98480) 257s 99.219% <= 1.375 milliseconds (cumulative count 99340) 257s 99.609% <= 1.415 milliseconds (cumulative count 99650) 257s 99.805% <= 1.455 milliseconds (cumulative count 99820) 257s 99.902% <= 1.495 milliseconds (cumulative count 99910) 257s 99.951% <= 1.519 milliseconds (cumulative count 99970) 257s 99.976% <= 1.527 milliseconds (cumulative count 99980) 257s 99.988% <= 1.551 milliseconds (cumulative count 99990) 257s 99.994% <= 1.599 milliseconds (cumulative count 100000) 257s 100.000% <= 1.599 milliseconds (cumulative count 100000) 257s 257s Cumulative distribution of latencies: 257s 0.000% <= 0.103 milliseconds (cumulative count 0) 257s 0.020% <= 0.303 milliseconds (cumulative count 20) 257s 0.260% <= 0.407 milliseconds (cumulative count 260) 257s 0.480% <= 0.503 milliseconds (cumulative count 480) 257s 1.260% <= 0.607 milliseconds (cumulative count 1260) 257s 6.810% <= 0.703 milliseconds (cumulative count 6810) 257s 20.930% <= 0.807 milliseconds (cumulative count 20930) 257s 38.230% <= 0.903 milliseconds (cumulative count 38230) 257s 62.930% <= 1.007 milliseconds (cumulative count 62930) 257s 80.520% <= 1.103 milliseconds (cumulative count 80520) 257s 94.550% <= 1.207 milliseconds (cumulative count 94550) 257s 98.350% <= 1.303 milliseconds (cumulative count 98350) 257s 99.570% <= 1.407 milliseconds (cumulative count 99570) 257s 99.930% <= 1.503 milliseconds (cumulative count 99930) 257s 100.000% <= 1.607 milliseconds (cumulative count 100000) 257s 257s Summary: 257s throughput summary: 476190.50 requests per second 257s latency summary (msec): 257s avg min p50 p95 p99 max 257s 0.952 0.288 0.959 1.215 1.351 1.599 257s ====== INCR ====== 
257s 100000 requests completed in 0.22 seconds 257s 50 parallel clients 257s 3 bytes payload 257s keep alive: 1 257s host configuration "save": 3600 1 300 100 60 10000 257s host configuration "appendonly": no 257s multi-thread: no 257s 257s Latency by percentile distribution: 257s 0.000% <= 0.311 milliseconds (cumulative count 10) 257s 50.000% <= 0.991 milliseconds (cumulative count 51540) 257s 75.000% <= 1.103 milliseconds (cumulative count 75690) 257s 87.500% <= 1.183 milliseconds (cumulative count 88250) 257s 93.750% <= 1.231 milliseconds (cumulative count 94480) 257s 96.875% <= 1.271 milliseconds (cumulative count 97200) 257s 98.438% <= 1.319 milliseconds (cumulative count 98490) 257s 99.219% <= 1.399 milliseconds (cumulative count 99250) 257s 99.609% <= 1.543 milliseconds (cumulative count 99630) 257s 99.805% <= 1.671 milliseconds (cumulative count 99810) 257s 99.902% <= 2.167 milliseconds (cumulative count 99910) 257s 99.951% <= 2.543 milliseconds (cumulative count 99960) 257s 99.976% <= 2.591 milliseconds (cumulative count 99980) 257s 99.988% <= 2.615 milliseconds (cumulative count 99990) 257s 99.994% <= 2.631 milliseconds (cumulative count 100000) 257s 100.000% <= 2.631 milliseconds (cumulative count 100000) 257s 257s Cumulative distribution of latencies: 257s 0.000% <= 0.103 milliseconds (cumulative count 0) 257s 0.050% <= 0.407 milliseconds (cumulative count 50) 257s 0.120% <= 0.503 milliseconds (cumulative count 120) 257s 0.360% <= 0.607 milliseconds (cumulative count 360) 257s 4.550% <= 0.703 milliseconds (cumulative count 4550) 257s 17.970% <= 0.807 milliseconds (cumulative count 17970) 257s 30.780% <= 0.903 milliseconds (cumulative count 30780) 257s 55.690% <= 1.007 milliseconds (cumulative count 55690) 257s 75.690% <= 1.103 milliseconds (cumulative count 75690) 257s 91.640% <= 1.207 milliseconds (cumulative count 91640) 257s 98.190% <= 1.303 milliseconds (cumulative count 98190) 257s 99.290% <= 1.407 milliseconds (cumulative count 99290) 257s 99.540% <= 1.503 milliseconds (cumulative count 99540) 257s 99.750% <= 1.607 milliseconds (cumulative count 99750) 257s 99.830% <= 1.703 milliseconds (cumulative count 99830) 257s 99.900% <= 1.807 milliseconds (cumulative count 99900) 257s 100.000% <= 3.103 milliseconds (cumulative count 100000) 257s 257s Summary: 257s throughput summary: 462962.94 requests per second 257s latency summary (msec): 257s avg min p50 p95 p99 max 257s 0.983 0.304 0.991 1.239 1.367 2.631 258s LPUSH: rps=0.0 (overall: 0.0) avg_msec=nan (overall: nan) LPUSH: rps=347160.0 (overall: 345776.9) avg_msec=1.331 (overall: 1.331) ====== LPUSH ====== 258s 100000 requests completed in 0.29 seconds 258s 50 parallel clients 258s 3 bytes payload 258s keep alive: 1 258s host configuration "save": 3600 1 300 100 60 10000 258s host configuration "appendonly": no 258s multi-thread: no 258s 258s Latency by percentile distribution: 258s 0.000% <= 0.423 milliseconds (cumulative count 10) 258s 50.000% <= 1.343 milliseconds (cumulative count 50290) 258s 75.000% <= 1.487 milliseconds (cumulative count 76250) 258s 87.500% <= 1.583 milliseconds (cumulative count 87990) 258s 93.750% <= 1.663 milliseconds (cumulative count 94080) 258s 96.875% <= 1.735 milliseconds (cumulative count 97040) 258s 98.438% <= 1.799 milliseconds (cumulative count 98470) 258s 99.219% <= 1.895 milliseconds (cumulative count 99230) 258s 99.609% <= 2.199 milliseconds (cumulative count 99610) 258s 99.805% <= 2.471 milliseconds (cumulative count 99810) 258s 99.902% <= 2.583 milliseconds (cumulative count 99910) 258s 
99.951% <= 2.663 milliseconds (cumulative count 99960) 258s 99.976% <= 2.703 milliseconds (cumulative count 99980) 258s 99.988% <= 2.719 milliseconds (cumulative count 99990) 258s 99.994% <= 2.751 milliseconds (cumulative count 100000) 258s 100.000% <= 2.751 milliseconds (cumulative count 100000) 258s 258s Cumulative distribution of latencies: 258s 0.000% <= 0.103 milliseconds (cumulative count 0) 258s 0.090% <= 0.503 milliseconds (cumulative count 90) 258s 0.240% <= 0.607 milliseconds (cumulative count 240) 258s 0.630% <= 0.703 milliseconds (cumulative count 630) 258s 1.850% <= 0.807 milliseconds (cumulative count 1850) 258s 4.500% <= 0.903 milliseconds (cumulative count 4500) 258s 9.180% <= 1.007 milliseconds (cumulative count 9180) 258s 15.900% <= 1.103 milliseconds (cumulative count 15900) 258s 28.370% <= 1.207 milliseconds (cumulative count 28370) 258s 43.180% <= 1.303 milliseconds (cumulative count 43180) 258s 62.370% <= 1.407 milliseconds (cumulative count 62370) 258s 78.430% <= 1.503 milliseconds (cumulative count 78430) 258s 90.230% <= 1.607 milliseconds (cumulative count 90230) 258s 95.980% <= 1.703 milliseconds (cumulative count 95980) 258s 98.600% <= 1.807 milliseconds (cumulative count 98600) 258s 99.250% <= 1.903 milliseconds (cumulative count 99250) 258s 99.510% <= 2.007 milliseconds (cumulative count 99510) 258s 99.560% <= 2.103 milliseconds (cumulative count 99560) 258s 100.000% <= 3.103 milliseconds (cumulative count 100000) 258s 258s Summary: 258s throughput summary: 346020.75 requests per second 258s latency summary (msec): 258s avg min p50 p95 p99 max 258s 1.329 0.416 1.343 1.687 1.855 2.751 258s RPUSH: rps=352270.9 (overall: 419052.2) avg_msec=1.096 (overall: 1.096) ====== RPUSH ====== 258s 100000 requests completed in 0.24 seconds 258s 50 parallel clients 258s 3 bytes payload 258s keep alive: 1 258s host configuration "save": 3600 1 300 100 60 10000 258s host configuration "appendonly": no 258s multi-thread: no 258s 258s Latency by percentile distribution: 258s 0.000% <= 0.271 milliseconds (cumulative count 10) 258s 50.000% <= 1.119 milliseconds (cumulative count 51210) 258s 75.000% <= 1.231 milliseconds (cumulative count 75250) 258s 87.500% <= 1.311 milliseconds (cumulative count 87990) 258s 93.750% <= 1.359 milliseconds (cumulative count 94580) 258s 96.875% <= 1.391 milliseconds (cumulative count 97190) 258s 98.438% <= 1.431 milliseconds (cumulative count 98640) 258s 99.219% <= 1.487 milliseconds (cumulative count 99290) 258s 99.609% <= 1.559 milliseconds (cumulative count 99610) 258s 99.805% <= 1.623 milliseconds (cumulative count 99840) 258s 99.902% <= 1.655 milliseconds (cumulative count 99920) 258s 99.951% <= 1.679 milliseconds (cumulative count 99960) 258s 99.976% <= 1.695 milliseconds (cumulative count 99980) 258s 99.988% <= 1.703 milliseconds (cumulative count 99990) 258s 99.994% <= 1.727 milliseconds (cumulative count 100000) 258s 100.000% <= 1.727 milliseconds (cumulative count 100000) 258s 258s Cumulative distribution of latencies: 258s 0.000% <= 0.103 milliseconds (cumulative count 0) 258s 0.020% <= 0.303 milliseconds (cumulative count 20) 258s 0.160% <= 0.407 milliseconds (cumulative count 160) 258s 0.250% <= 0.503 milliseconds (cumulative count 250) 258s 0.370% <= 0.607 milliseconds (cumulative count 370) 258s 0.850% <= 0.703 milliseconds (cumulative count 850) 258s 10.600% <= 0.807 milliseconds (cumulative count 10600) 258s 19.070% <= 0.903 milliseconds (cumulative count 19070) 258s 26.940% <= 1.007 milliseconds (cumulative count 26940) 258s 47.170% 
<= 1.103 milliseconds (cumulative count 47170) 258s 71.080% <= 1.207 milliseconds (cumulative count 71080) 258s 86.760% <= 1.303 milliseconds (cumulative count 86760) 258s 97.930% <= 1.407 milliseconds (cumulative count 97930) 258s 99.350% <= 1.503 milliseconds (cumulative count 99350) 258s 99.750% <= 1.607 milliseconds (cumulative count 99750) 258s 99.990% <= 1.703 milliseconds (cumulative count 99990) 258s 100.000% <= 1.807 milliseconds (cumulative count 100000) 258s 258s Summary: 258s throughput summary: 420168.06 requests per second 258s latency summary (msec): 258s avg min p50 p95 p99 max 258s 1.096 0.264 1.119 1.367 1.463 1.727 258s LPOP: rps=325000.0 (overall: 367647.1) avg_msec=1.259 (overall: 1.259) ====== LPOP ====== 258s 100000 requests completed in 0.27 seconds 258s 50 parallel clients 258s 3 bytes payload 258s keep alive: 1 258s host configuration "save": 3600 1 300 100 60 10000 258s host configuration "appendonly": no 258s multi-thread: no 258s 258s Latency by percentile distribution: 258s 0.000% <= 0.263 milliseconds (cumulative count 10) 258s 50.000% <= 1.263 milliseconds (cumulative count 50600) 258s 75.000% <= 1.399 milliseconds (cumulative count 76220) 258s 87.500% <= 1.487 milliseconds (cumulative count 88330) 258s 93.750% <= 1.559 milliseconds (cumulative count 94250) 258s 96.875% <= 1.647 milliseconds (cumulative count 96920) 258s 98.438% <= 1.943 milliseconds (cumulative count 98440) 258s 99.219% <= 2.671 milliseconds (cumulative count 99220) 258s 99.609% <= 3.479 milliseconds (cumulative count 99610) 258s 99.805% <= 3.607 milliseconds (cumulative count 99810) 258s 99.902% <= 3.711 milliseconds (cumulative count 99910) 258s 99.951% <= 3.807 milliseconds (cumulative count 99960) 258s 99.976% <= 3.831 milliseconds (cumulative count 99980) 258s 99.988% <= 3.847 milliseconds (cumulative count 99990) 258s 99.994% <= 3.855 milliseconds (cumulative count 100000) 258s 100.000% <= 3.855 milliseconds (cumulative count 100000) 258s 258s Cumulative distribution of latencies: 258s 0.000% <= 0.103 milliseconds (cumulative count 0) 258s 0.030% <= 0.303 milliseconds (cumulative count 30) 258s 0.280% <= 0.407 milliseconds (cumulative count 280) 258s 0.500% <= 0.503 milliseconds (cumulative count 500) 258s 0.690% <= 0.607 milliseconds (cumulative count 690) 258s 1.060% <= 0.703 milliseconds (cumulative count 1060) 258s 2.800% <= 0.807 milliseconds (cumulative count 2800) 258s 12.010% <= 0.903 milliseconds (cumulative count 12010) 258s 18.210% <= 1.007 milliseconds (cumulative count 18210) 258s 23.100% <= 1.103 milliseconds (cumulative count 23100) 258s 38.600% <= 1.207 milliseconds (cumulative count 38600) 258s 59.110% <= 1.303 milliseconds (cumulative count 59110) 258s 77.560% <= 1.407 milliseconds (cumulative count 77560) 258s 89.950% <= 1.503 milliseconds (cumulative count 89950) 258s 96.230% <= 1.607 milliseconds (cumulative count 96230) 258s 97.480% <= 1.703 milliseconds (cumulative count 97480) 258s 98.110% <= 1.807 milliseconds (cumulative count 98110) 258s 98.400% <= 1.903 milliseconds (cumulative count 98400) 258s 98.470% <= 2.007 milliseconds (cumulative count 98470) 258s 98.580% <= 2.103 milliseconds (cumulative count 98580) 258s 99.500% <= 3.103 milliseconds (cumulative count 99500) 258s 100.000% <= 4.103 milliseconds (cumulative count 100000) 258s 258s Summary: 258s throughput summary: 369003.69 requests per second 258s latency summary (msec): 258s avg min p50 p95 p99 max 258s 1.256 0.256 1.263 1.575 2.447 3.855 259s RPOP: rps=318884.5 (overall: 402211.1) avg_msec=1.141 
(overall: 1.141) ====== RPOP ====== 259s 100000 requests completed in 0.25 seconds 259s 50 parallel clients 259s 3 bytes payload 259s keep alive: 1 259s host configuration "save": 3600 1 300 100 60 10000 259s host configuration "appendonly": no 259s multi-thread: no 259s 259s Latency by percentile distribution: 259s 0.000% <= 0.319 milliseconds (cumulative count 10) 259s 50.000% <= 1.167 milliseconds (cumulative count 51790) 259s 75.000% <= 1.279 milliseconds (cumulative count 75150) 259s 87.500% <= 1.359 milliseconds (cumulative count 87610) 259s 93.750% <= 1.407 milliseconds (cumulative count 93920) 259s 96.875% <= 1.447 milliseconds (cumulative count 97050) 259s 98.438% <= 1.487 milliseconds (cumulative count 98440) 259s 99.219% <= 1.535 milliseconds (cumulative count 99260) 259s 99.609% <= 1.583 milliseconds (cumulative count 99610) 259s 99.805% <= 1.639 milliseconds (cumulative count 99830) 259s 99.902% <= 1.671 milliseconds (cumulative count 99910) 259s 99.951% <= 1.695 milliseconds (cumulative count 99960) 259s 99.976% <= 1.703 milliseconds (cumulative count 99980) 259s 99.988% <= 1.719 milliseconds (cumulative count 99990) 259s 99.994% <= 1.735 milliseconds (cumulative count 100000) 259s 100.000% <= 1.735 milliseconds (cumulative count 100000) 259s 259s Cumulative distribution of latencies: 259s 0.000% <= 0.103 milliseconds (cumulative count 0) 259s 0.070% <= 0.407 milliseconds (cumulative count 70) 259s 0.160% <= 0.503 milliseconds (cumulative count 160) 259s 0.320% <= 0.607 milliseconds (cumulative count 320) 259s 0.750% <= 0.703 milliseconds (cumulative count 750) 259s 5.800% <= 0.807 milliseconds (cumulative count 5800) 259s 17.010% <= 0.903 milliseconds (cumulative count 17010) 259s 22.000% <= 1.007 milliseconds (cumulative count 22000) 259s 36.700% <= 1.103 milliseconds (cumulative count 36700) 259s 61.330% <= 1.207 milliseconds (cumulative count 61330) 259s 79.080% <= 1.303 milliseconds (cumulative count 79080) 259s 93.920% <= 1.407 milliseconds (cumulative count 93920) 259s 98.830% <= 1.503 milliseconds (cumulative count 98830) 259s 99.720% <= 1.607 milliseconds (cumulative count 99720) 259s 99.980% <= 1.703 milliseconds (cumulative count 99980) 259s 100.000% <= 1.807 milliseconds (cumulative count 100000) 259s 259s Summary: 259s throughput summary: 403225.81 requests per second 259s latency summary (msec): 259s avg min p50 p95 p99 max 259s 1.140 0.312 1.167 1.423 1.519 1.735 259s SADD: rps=361673.3 (overall: 453900.0) avg_msec=1.006 (overall: 1.006) ====== SADD ====== 259s 100000 requests completed in 0.22 seconds 259s 50 parallel clients 259s 3 bytes payload 259s keep alive: 1 259s host configuration "save": 3600 1 300 100 60 10000 259s host configuration "appendonly": no 259s multi-thread: no 259s 259s Latency by percentile distribution: 259s 0.000% <= 0.303 milliseconds (cumulative count 10) 259s 50.000% <= 1.023 milliseconds (cumulative count 51910) 259s 75.000% <= 1.127 milliseconds (cumulative count 75120) 259s 87.500% <= 1.207 milliseconds (cumulative count 88560) 259s 93.750% <= 1.247 milliseconds (cumulative count 93920) 259s 96.875% <= 1.287 milliseconds (cumulative count 97260) 259s 98.438% <= 1.335 milliseconds (cumulative count 98560) 259s 99.219% <= 1.407 milliseconds (cumulative count 99280) 259s 99.609% <= 1.495 milliseconds (cumulative count 99620) 259s 99.805% <= 1.583 milliseconds (cumulative count 99810) 259s 99.902% <= 1.687 milliseconds (cumulative count 99910) 259s 99.951% <= 1.807 milliseconds (cumulative count 99960) 259s 99.976% <= 1.863 
milliseconds (cumulative count 99980) 259s 99.988% <= 1.879 milliseconds (cumulative count 99990) 259s 99.994% <= 1.903 milliseconds (cumulative count 100000) 259s 100.000% <= 1.903 milliseconds (cumulative count 100000) 259s 259s Cumulative distribution of latencies: 259s 0.000% <= 0.103 milliseconds (cumulative count 0) 259s 0.010% <= 0.303 milliseconds (cumulative count 10) 259s 0.190% <= 0.407 milliseconds (cumulative count 190) 259s 0.470% <= 0.503 milliseconds (cumulative count 470) 259s 0.760% <= 0.607 milliseconds (cumulative count 760) 259s 3.920% <= 0.703 milliseconds (cumulative count 3920) 259s 17.240% <= 0.807 milliseconds (cumulative count 17240) 259s 26.070% <= 0.903 milliseconds (cumulative count 26070) 259s 47.920% <= 1.007 milliseconds (cumulative count 47920) 259s 70.670% <= 1.103 milliseconds (cumulative count 70670) 259s 88.560% <= 1.207 milliseconds (cumulative count 88560) 259s 97.890% <= 1.303 milliseconds (cumulative count 97890) 259s 99.280% <= 1.407 milliseconds (cumulative count 99280) 259s 99.650% <= 1.503 milliseconds (cumulative count 99650) 259s 99.840% <= 1.607 milliseconds (cumulative count 99840) 259s 99.920% <= 1.703 milliseconds (cumulative count 99920) 259s 99.960% <= 1.807 milliseconds (cumulative count 99960) 259s 100.000% <= 1.903 milliseconds (cumulative count 100000) 259s 259s Summary: 259s throughput summary: 454545.47 requests per second 259s latency summary (msec): 259s avg min p50 p95 p99 max 259s 1.004 0.296 1.023 1.263 1.367 1.903 259s HSET: rps=382080.0 (overall: 418947.4) avg_msec=1.100 (overall: 1.100) ====== HSET ====== 259s 100000 requests completed in 0.24 seconds 259s 50 parallel clients 259s 3 bytes payload 259s keep alive: 1 259s host configuration "save": 3600 1 300 100 60 10000 259s host configuration "appendonly": no 259s multi-thread: no 259s 259s Latency by percentile distribution: 259s 0.000% <= 0.343 milliseconds (cumulative count 10) 259s 50.000% <= 1.119 milliseconds (cumulative count 50600) 259s 75.000% <= 1.239 milliseconds (cumulative count 76360) 259s 87.500% <= 1.311 milliseconds (cumulative count 88540) 259s 93.750% <= 1.351 milliseconds (cumulative count 94110) 259s 96.875% <= 1.391 milliseconds (cumulative count 97340) 259s 98.438% <= 1.423 milliseconds (cumulative count 98570) 259s 99.219% <= 1.463 milliseconds (cumulative count 99240) 259s 99.609% <= 1.519 milliseconds (cumulative count 99650) 259s 99.805% <= 1.567 milliseconds (cumulative count 99810) 259s 99.902% <= 1.607 milliseconds (cumulative count 99910) 259s 99.951% <= 1.655 milliseconds (cumulative count 99960) 259s 99.976% <= 1.679 milliseconds (cumulative count 99980) 259s 99.988% <= 1.767 milliseconds (cumulative count 99990) 259s 99.994% <= 1.847 milliseconds (cumulative count 100000) 259s 100.000% <= 1.847 milliseconds (cumulative count 100000) 259s 259s Cumulative distribution of latencies: 259s 0.000% <= 0.103 milliseconds (cumulative count 0) 259s 0.030% <= 0.407 milliseconds (cumulative count 30) 259s 0.110% <= 0.503 milliseconds (cumulative count 110) 259s 0.160% <= 0.607 milliseconds (cumulative count 160) 259s 0.720% <= 0.703 milliseconds (cumulative count 720) 259s 9.940% <= 0.807 milliseconds (cumulative count 9940) 259s 17.760% <= 0.903 milliseconds (cumulative count 17760) 259s 26.240% <= 1.007 milliseconds (cumulative count 26240) 259s 46.640% <= 1.103 milliseconds (cumulative count 46640) 259s 70.390% <= 1.207 milliseconds (cumulative count 70390) 259s 87.140% <= 1.303 milliseconds (cumulative count 87140) 259s 98.000% <= 1.407 
milliseconds (cumulative count 98000) 259s 99.560% <= 1.503 milliseconds (cumulative count 99560) 259s 99.910% <= 1.607 milliseconds (cumulative count 99910) 259s 99.980% <= 1.703 milliseconds (cumulative count 99980) 259s 99.990% <= 1.807 milliseconds (cumulative count 99990) 259s 100.000% <= 1.903 milliseconds (cumulative count 100000) 259s 259s Summary: 259s throughput summary: 418410.06 requests per second 259s latency summary (msec): 259s avg min p50 p95 p99 max 259s 1.099 0.336 1.119 1.367 1.447 1.847 259s ====== SPOP ====== 259s 100000 requests completed in 0.21 seconds 259s 50 parallel clients 259s 3 bytes payload 259s keep alive: 1 259s host configuration "save": 3600 1 300 100 60 10000 259s host configuration "appendonly": no 259s multi-thread: no 259s 259s Latency by percentile distribution: 259s 0.000% <= 0.287 milliseconds (cumulative count 10) 259s 50.000% <= 0.919 milliseconds (cumulative count 50900) 259s 75.000% <= 1.039 milliseconds (cumulative count 75460) 259s 87.500% <= 1.119 milliseconds (cumulative count 88360) 259s 93.750% <= 1.175 milliseconds (cumulative count 94150) 259s 96.875% <= 1.279 milliseconds (cumulative count 96970) 259s 98.438% <= 1.615 milliseconds (cumulative count 98450) 259s 99.219% <= 2.151 milliseconds (cumulative count 99220) 259s 99.609% <= 4.519 milliseconds (cumulative count 99610) 259s 99.805% <= 4.759 milliseconds (cumulative count 99810) 259s 99.902% <= 4.863 milliseconds (cumulative count 99910) 259s 99.951% <= 4.919 milliseconds (cumulative count 99960) 259s 99.976% <= 4.943 milliseconds (cumulative count 99980) 259s 99.988% <= 4.951 milliseconds (cumulative count 99990) 259s 99.994% <= 4.991 milliseconds (cumulative count 100000) 259s 100.000% <= 4.991 milliseconds (cumulative count 100000) 259s 259s Cumulative distribution of latencies: 259s 0.000% <= 0.103 milliseconds (cumulative count 0) 259s 0.040% <= 0.303 milliseconds (cumulative count 40) 259s 0.470% <= 0.407 milliseconds (cumulative count 470) 259s 0.830% <= 0.503 milliseconds (cumulative count 830) 259s 1.820% <= 0.607 milliseconds (cumulative count 1820) 259s 12.510% <= 0.703 milliseconds (cumulative count 12510) 259s 27.450% <= 0.807 milliseconds (cumulative count 27450) 259s 47.190% <= 0.903 milliseconds (cumulative count 47190) 259s 69.930% <= 1.007 milliseconds (cumulative count 69930) 259s 85.910% <= 1.103 milliseconds (cumulative count 85910) 259s 95.600% <= 1.207 milliseconds (cumulative count 95600) 259s 97.240% <= 1.303 milliseconds (cumulative count 97240) 259s 97.900% <= 1.407 milliseconds (cumulative count 97900) 259s 98.150% <= 1.503 milliseconds (cumulative count 98150) 259s 98.420% <= 1.607 milliseconds (cumulative count 98420) 259s 98.650% <= 1.703 milliseconds (cumulative count 98650) 259s 98.800% <= 1.807 milliseconds (cumulative count 98800) 259s 98.900% <= 1.903 milliseconds (cumulative count 98900) 259s 99.020% <= 2.007 milliseconds (cumulative count 99020) 259s 99.140% <= 2.103 milliseconds (cumulative count 99140) 259s 99.500% <= 3.103 milliseconds (cumulative count 99500) 259s 100.000% <= 5.103 milliseconds (cumulative count 100000) 259s 259s Summary: 259s throughput summary: 480769.22 requests per second 259s latency summary (msec): 259s avg min p50 p95 p99 max 259s 0.941 0.280 0.919 1.199 1.991 4.991 259s ZADD: rps=42828.7 (overall: 383928.6) avg_msec=1.160 (overall: 1.160) ====== ZADD ====== 259s 100000 requests completed in 0.25 seconds 259s 50 parallel clients 259s 3 bytes payload 259s keep alive: 1 259s host configuration "save": 3600 1 300 100 60 
10000 259s host configuration "appendonly": no 259s multi-thread: no 259s 259s Latency by percentile distribution: 259s 0.000% <= 0.359 milliseconds (cumulative count 10) 259s 50.000% <= 1.183 milliseconds (cumulative count 51410) 259s 75.000% <= 1.303 milliseconds (cumulative count 76280) 259s 87.500% <= 1.383 milliseconds (cumulative count 88620) 259s 93.750% <= 1.431 milliseconds (cumulative count 94470) 259s 96.875% <= 1.471 milliseconds (cumulative count 97030) 259s 98.438% <= 1.519 milliseconds (cumulative count 98540) 259s 99.219% <= 1.583 milliseconds (cumulative count 99270) 259s 99.609% <= 1.639 milliseconds (cumulative count 99620) 259s 99.805% <= 1.695 milliseconds (cumulative count 99820) 259s 99.902% <= 1.743 milliseconds (cumulative count 99910) 259s 99.951% <= 1.799 milliseconds (cumulative count 99960) 259s 99.976% <= 1.831 milliseconds (cumulative count 99980) 259s 99.988% <= 1.855 milliseconds (cumulative count 99990) 259s 99.994% <= 1.871 milliseconds (cumulative count 100000) 259s 100.000% <= 1.871 milliseconds (cumulative count 100000) 259s 259s Cumulative distribution of latencies: 259s 0.000% <= 0.103 milliseconds (cumulative count 0) 259s 0.100% <= 0.407 milliseconds (cumulative count 100) 259s 0.230% <= 0.503 milliseconds (cumulative count 230) 259s 0.340% <= 0.607 milliseconds (cumulative count 340) 259s 0.570% <= 0.703 milliseconds (cumulative count 570) 259s 4.930% <= 0.807 milliseconds (cumulative count 4930) 259s 15.090% <= 0.903 milliseconds (cumulative count 15090) 259s 21.090% <= 1.007 milliseconds (cumulative count 21090) 259s 34.130% <= 1.103 milliseconds (cumulative count 34130) 259s 56.900% <= 1.207 milliseconds (cumulative count 56900) 259s 76.280% <= 1.303 milliseconds (cumulative count 76280) 259s 91.930% <= 1.407 milliseconds (cumulative count 91930) 259s 98.180% <= 1.503 milliseconds (cumulative count 98180) 259s 99.440% <= 1.607 milliseconds (cumulative count 99440) 259s 99.820% <= 1.703 milliseconds (cumulative count 99820) 259s 99.960% <= 1.807 milliseconds (cumulative count 99960) 259s 100.000% <= 1.903 milliseconds (cumulative count 100000) 259s 259s Summary: 259s throughput summary: 398406.41 requests per second 259s latency summary (msec): 259s avg min p50 p95 p99 max 259s 1.157 0.352 1.183 1.439 1.559 1.871 260s ZPOPMIN: rps=46960.0 (overall: 469600.0) avg_msec=0.936 (overall: 0.936) ====== ZPOPMIN ====== 260s 100000 requests completed in 0.21 seconds 260s 50 parallel clients 260s 3 bytes payload 260s keep alive: 1 260s host configuration "save": 3600 1 300 100 60 10000 260s host configuration "appendonly": no 260s multi-thread: no 260s 260s Latency by percentile distribution: 260s 0.000% <= 0.319 milliseconds (cumulative count 10) 260s 50.000% <= 0.943 milliseconds (cumulative count 51380) 260s 75.000% <= 1.055 milliseconds (cumulative count 75030) 260s 87.500% <= 1.135 milliseconds (cumulative count 87820) 260s 93.750% <= 1.183 milliseconds (cumulative count 94560) 260s 96.875% <= 1.215 milliseconds (cumulative count 96880) 260s 98.438% <= 1.271 milliseconds (cumulative count 98500) 260s 99.219% <= 1.327 milliseconds (cumulative count 99260) 260s 99.609% <= 1.399 milliseconds (cumulative count 99610) 260s 99.805% <= 1.447 milliseconds (cumulative count 99810) 260s 99.902% <= 1.503 milliseconds (cumulative count 99920) 260s 99.951% <= 1.527 milliseconds (cumulative count 99960) 260s 99.976% <= 1.543 milliseconds (cumulative count 99980) 260s 99.988% <= 1.559 milliseconds (cumulative count 99990) 260s 99.994% <= 1.567 milliseconds 
(cumulative count 100000) 260s 100.000% <= 1.567 milliseconds (cumulative count 100000) 260s 260s Cumulative distribution of latencies: 260s 0.000% <= 0.103 milliseconds (cumulative count 0) 260s 0.150% <= 0.407 milliseconds (cumulative count 150) 260s 0.350% <= 0.503 milliseconds (cumulative count 350) 260s 0.680% <= 0.607 milliseconds (cumulative count 680) 260s 7.770% <= 0.703 milliseconds (cumulative count 7770) 260s 23.790% <= 0.807 milliseconds (cumulative count 23790) 260s 42.020% <= 0.903 milliseconds (cumulative count 42020) 260s 66.580% <= 1.007 milliseconds (cumulative count 66580) 260s 82.910% <= 1.103 milliseconds (cumulative count 82910) 260s 96.420% <= 1.207 milliseconds (cumulative count 96420) 260s 98.960% <= 1.303 milliseconds (cumulative count 98960) 260s 99.680% <= 1.407 milliseconds (cumulative count 99680) 260s 99.920% <= 1.503 milliseconds (cumulative count 99920) 260s 100.000% <= 1.607 milliseconds (cumulative count 100000) 260s 260s Summary: 260s throughput summary: 485436.91 requests per second 260s latency summary (msec): 260s avg min p50 p95 p99 max 260s 0.936 0.312 0.943 1.191 1.311 1.567 260s LPUSH (needed to benchmark LRANGE): rps=92111.6 (overall: 340000.0) avg_msec=1.331 (overall: 1.331) ====== LPUSH (needed to benchmark LRANGE) ====== 260s 100000 requests completed in 0.29 seconds 260s 50 parallel clients 260s 3 bytes payload 260s keep alive: 1 260s host configuration "save": 3600 1 300 100 60 10000 260s host configuration "appendonly": no 260s multi-thread: no 260s 260s Latency by percentile distribution: 260s 0.000% <= 0.367 milliseconds (cumulative count 20) 260s 50.000% <= 1.359 milliseconds (cumulative count 50790) 260s 75.000% <= 1.495 milliseconds (cumulative count 75650) 260s 87.500% <= 1.591 milliseconds (cumulative count 88100) 260s 93.750% <= 1.663 milliseconds (cumulative count 94180) 260s 96.875% <= 1.727 milliseconds (cumulative count 96990) 260s 98.438% <= 1.807 milliseconds (cumulative count 98460) 260s 99.219% <= 1.943 milliseconds (cumulative count 99220) 260s 99.609% <= 2.119 milliseconds (cumulative count 99610) 260s 99.805% <= 2.311 milliseconds (cumulative count 99810) 260s 99.902% <= 2.439 milliseconds (cumulative count 99910) 260s 99.951% <= 2.543 milliseconds (cumulative count 99960) 260s 99.976% <= 2.575 milliseconds (cumulative count 99980) 260s 99.988% <= 2.615 milliseconds (cumulative count 99990) 260s 99.994% <= 2.623 milliseconds (cumulative count 100000) 260s 100.000% <= 2.623 milliseconds (cumulative count 100000) 260s 260s Cumulative distribution of latencies: 260s 0.000% <= 0.103 milliseconds (cumulative count 0) 260s 0.050% <= 0.407 milliseconds (cumulative count 50) 260s 0.070% <= 0.503 milliseconds (cumulative count 70) 260s 0.150% <= 0.607 milliseconds (cumulative count 150) 260s 0.460% <= 0.703 milliseconds (cumulative count 460) 260s 1.240% <= 0.807 milliseconds (cumulative count 1240) 260s 4.380% <= 0.903 milliseconds (cumulative count 4380) 260s 10.640% <= 1.007 milliseconds (cumulative count 10640) 260s 15.540% <= 1.103 milliseconds (cumulative count 15540) 260s 25.470% <= 1.207 milliseconds (cumulative count 25470) 260s 40.110% <= 1.303 milliseconds (cumulative count 40110) 260s 59.940% <= 1.407 milliseconds (cumulative count 59940) 260s 76.840% <= 1.503 milliseconds (cumulative count 76840) 260s 89.700% <= 1.607 milliseconds (cumulative count 89700) 260s 96.260% <= 1.703 milliseconds (cumulative count 96260) 260s 98.460% <= 1.807 milliseconds (cumulative count 98460) 260s 98.990% <= 1.903 milliseconds 
260s LPUSH (needed to benchmark LRANGE): rps=92111.6 (overall: 340000.0) avg_msec=1.331 (overall: 1.331)
260s ====== LPUSH (needed to benchmark LRANGE) ======
260s   100000 requests completed in 0.29 seconds
260s   50 parallel clients
260s   3 bytes payload
260s   keep alive: 1
260s   host configuration "save": 3600 1 300 100 60 10000
260s   host configuration "appendonly": no
260s   multi-thread: no
260s
260s Latency by percentile distribution:
260s 0.000% <= 0.367 milliseconds (cumulative count 20)
260s 50.000% <= 1.359 milliseconds (cumulative count 50790)
260s 75.000% <= 1.495 milliseconds (cumulative count 75650)
260s 87.500% <= 1.591 milliseconds (cumulative count 88100)
260s 93.750% <= 1.663 milliseconds (cumulative count 94180)
260s 96.875% <= 1.727 milliseconds (cumulative count 96990)
260s 98.438% <= 1.807 milliseconds (cumulative count 98460)
260s 99.219% <= 1.943 milliseconds (cumulative count 99220)
260s 99.609% <= 2.119 milliseconds (cumulative count 99610)
260s 99.805% <= 2.311 milliseconds (cumulative count 99810)
260s 99.902% <= 2.439 milliseconds (cumulative count 99910)
260s 99.951% <= 2.543 milliseconds (cumulative count 99960)
260s 99.976% <= 2.575 milliseconds (cumulative count 99980)
260s 99.988% <= 2.615 milliseconds (cumulative count 99990)
260s 99.994% <= 2.623 milliseconds (cumulative count 100000)
260s 100.000% <= 2.623 milliseconds (cumulative count 100000)
260s
260s Cumulative distribution of latencies:
260s 0.000% <= 0.103 milliseconds (cumulative count 0)
260s 0.050% <= 0.407 milliseconds (cumulative count 50)
260s 0.070% <= 0.503 milliseconds (cumulative count 70)
260s 0.150% <= 0.607 milliseconds (cumulative count 150)
260s 0.460% <= 0.703 milliseconds (cumulative count 460)
260s 1.240% <= 0.807 milliseconds (cumulative count 1240)
260s 4.380% <= 0.903 milliseconds (cumulative count 4380)
260s 10.640% <= 1.007 milliseconds (cumulative count 10640)
260s 15.540% <= 1.103 milliseconds (cumulative count 15540)
260s 25.470% <= 1.207 milliseconds (cumulative count 25470)
260s 40.110% <= 1.303 milliseconds (cumulative count 40110)
260s 59.940% <= 1.407 milliseconds (cumulative count 59940)
260s 76.840% <= 1.503 milliseconds (cumulative count 76840)
260s 89.700% <= 1.607 milliseconds (cumulative count 89700)
260s 96.260% <= 1.703 milliseconds (cumulative count 96260)
260s 98.460% <= 1.807 milliseconds (cumulative count 98460)
260s 98.990% <= 1.903 milliseconds (cumulative count 98990)
260s 99.480% <= 2.007 milliseconds (cumulative count 99480)
260s 99.600% <= 2.103 milliseconds (cumulative count 99600)
260s 100.000% <= 3.103 milliseconds (cumulative count 100000)
260s
260s Summary:
260s   throughput summary: 344827.59 requests per second
260s   latency summary (msec):
260s           avg       min       p50       p95       p99       max
260s         1.339     0.360     1.359     1.679     1.911     2.623
261s LRANGE_100 (first 100 elements): rps=89282.9 (overall: 89574.5) avg_msec=2.799 (overall: 2.803)
261s ====== LRANGE_100 (first 100 elements) ======
261s   100000 requests completed in 1.12 seconds
261s   50 parallel clients
261s   3 bytes payload
261s   keep alive: 1
261s   host configuration "save": 3600 1 300 100 60 10000
261s   host configuration "appendonly": no
261s   multi-thread: no
261s
261s Latency by percentile distribution:
261s 0.000% <= 0.375 milliseconds (cumulative count 10)
261s 50.000% <= 2.783 milliseconds (cumulative count 51490)
261s 75.000% <= 2.863 milliseconds (cumulative count 75510)
261s 87.500% <= 2.935 milliseconds (cumulative count 88070)
261s 93.750% <= 3.015 milliseconds (cumulative count 94340)
261s 96.875% <= 3.111 milliseconds (cumulative count 97040)
261s 98.438% <= 3.207 milliseconds (cumulative count 98470)
261s 99.219% <= 3.479 milliseconds (cumulative count 99220)
261s 99.609% <= 4.423 milliseconds (cumulative count 99610)
261s 99.805% <= 5.559 milliseconds (cumulative count 99810)
261s 99.902% <= 6.567 milliseconds (cumulative count 99910)
261s 99.951% <= 7.119 milliseconds (cumulative count 99960)
261s 99.976% <= 7.319 milliseconds (cumulative count 99980)
261s 99.988% <= 7.471 milliseconds (cumulative count 99990)
261s 99.994% <= 7.639 milliseconds (cumulative count 100000)
261s 100.000% <= 7.639 milliseconds (cumulative count 100000)
261s
261s Cumulative distribution of latencies:
261s 0.000% <= 0.103 milliseconds (cumulative count 0)
261s 0.010% <= 0.407 milliseconds (cumulative count 10)
261s 0.020% <= 2.103 milliseconds (cumulative count 20)
261s 96.870% <= 3.103 milliseconds (cumulative count 96870)
261s 99.530% <= 4.103 milliseconds (cumulative count 99530)
261s 99.730% <= 5.103 milliseconds (cumulative count 99730)
261s 99.870% <= 6.103 milliseconds (cumulative count 99870)
261s 99.950% <= 7.103 milliseconds (cumulative count 99950)
261s 100.000% <= 8.103 milliseconds (cumulative count 100000)
261s
261s Summary:
261s   throughput summary: 89525.52 requests per second
261s   latency summary (msec):
261s           avg       min       p50       p95       p99       max
261s         2.803     0.368     2.783     3.031     3.327     7.639
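The full histograms above are useful for eyeballing tail latency, but for tracking regressions across runs the same data can be collected in compact form: valkey-benchmark (like redis-benchmark) accepts a --csv flag that emits one summary row per test. A sketch, assuming the same server and an output file name of our choosing:

  # Sketch: capture per-test summaries as CSV instead of histograms.
  valkey-benchmark -n 100000 -c 50 -d 3 -t lrange --csv > bench.csv
  cut -d, -f1,2 bench.csv   # test name and requests/sec columns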
265s LRANGE_300 (first 300 elements): rps=21556.4 (overall: 23167.8) avg_msec=13.285 (overall: 11.738)
265s ====== LRANGE_300 (first 300 elements) ======
265s   100000 requests completed in 4.32 seconds
265s   50 parallel clients
265s   3 bytes payload
265s   keep alive: 1
265s   host configuration "save": 3600 1 300 100 60 10000
265s   host configuration "appendonly": no
265s   multi-thread: no
265s
265s Latency by percentile distribution:
265s 0.000% <= 0.383 milliseconds (cumulative count 10)
265s 50.000% <= 10.927 milliseconds (cumulative count 50060)
265s 75.000% <= 15.023 milliseconds (cumulative count 75000)
265s 87.500% <= 18.479 milliseconds (cumulative count 87540)
265s 93.750% <= 20.847 milliseconds (cumulative count 93780)
265s 96.875% <= 22.735 milliseconds (cumulative count 96880)
265s 98.438% <= 23.855 milliseconds (cumulative count 98480)
265s 99.219% <= 24.735 milliseconds (cumulative count 99220)
265s 99.609% <= 25.263 milliseconds (cumulative count 99620)
265s 99.805% <= 25.695 milliseconds (cumulative count 99810)
265s 99.902% <= 25.967 milliseconds (cumulative count 99910)
265s 99.951% <= 26.399 milliseconds (cumulative count 99960)
265s 99.976% <= 26.719 milliseconds (cumulative count 99980)
265s 99.988% <= 26.959 milliseconds (cumulative count 99990)
265s 99.994% <= 27.199 milliseconds (cumulative count 100000)
265s 100.000% <= 27.199 milliseconds (cumulative count 100000)
265s
265s Cumulative distribution of latencies:
265s 0.000% <= 0.103 milliseconds (cumulative count 0)
265s 0.010% <= 0.407 milliseconds (cumulative count 10)
265s 0.020% <= 0.607 milliseconds (cumulative count 20)
265s 0.070% <= 0.807 milliseconds (cumulative count 70)
265s 0.110% <= 0.903 milliseconds (cumulative count 110)
265s 0.130% <= 1.007 milliseconds (cumulative count 130)
265s 0.180% <= 1.103 milliseconds (cumulative count 180)
265s 0.380% <= 1.207 milliseconds (cumulative count 380)
265s 0.470% <= 1.303 milliseconds (cumulative count 470)
265s 0.630% <= 1.407 milliseconds (cumulative count 630)
265s 0.820% <= 1.503 milliseconds (cumulative count 820)
265s 1.000% <= 1.607 milliseconds (cumulative count 1000)
265s 1.200% <= 1.703 milliseconds (cumulative count 1200)
265s 1.460% <= 1.807 milliseconds (cumulative count 1460)
265s 1.670% <= 1.903 milliseconds (cumulative count 1670)
265s 1.900% <= 2.007 milliseconds (cumulative count 1900)
265s 2.100% <= 2.103 milliseconds (cumulative count 2100)
265s 3.660% <= 3.103 milliseconds (cumulative count 3660)
265s 5.280% <= 4.103 milliseconds (cumulative count 5280)
265s 7.370% <= 5.103 milliseconds (cumulative count 7370)
265s 11.140% <= 6.103 milliseconds (cumulative count 11140)
265s 16.360% <= 7.103 milliseconds (cumulative count 16360)
265s 23.770% <= 8.103 milliseconds (cumulative count 23770)
265s 32.690% <= 9.103 milliseconds (cumulative count 32690)
265s 42.310% <= 10.103 milliseconds (cumulative count 42310)
265s 51.640% <= 11.103 milliseconds (cumulative count 51640)
265s 59.800% <= 12.103 milliseconds (cumulative count 59800)
265s 65.880% <= 13.103 milliseconds (cumulative count 65880)
265s 70.800% <= 14.103 milliseconds (cumulative count 70800)
265s 75.330% <= 15.103 milliseconds (cumulative count 75330)
265s 79.150% <= 16.103 milliseconds (cumulative count 79150)
265s 82.740% <= 17.103 milliseconds (cumulative count 82740)
265s 86.350% <= 18.111 milliseconds (cumulative count 86350)
265s 89.410% <= 19.103 milliseconds (cumulative count 89410)
265s 92.010% <= 20.111 milliseconds (cumulative count 92010)
265s 94.320% <= 21.103 milliseconds (cumulative count 94320)
265s 96.030% <= 22.111 milliseconds (cumulative count 96030)
265s 97.370% <= 23.103 milliseconds (cumulative count 97370)
265s 98.730% <= 24.111 milliseconds (cumulative count 98730)
265s 99.510% <= 25.103 milliseconds (cumulative count 99510)
265s 99.910% <= 26.111 milliseconds (cumulative count 99910)
265s 99.990% <= 27.103 milliseconds (cumulative count 99990)
265s 100.000% <= 28.111 milliseconds (cumulative count 100000)
265s
265s Summary:
265s   throughput summary: 23121.39 requests per second
265s   latency summary (msec):
265s           avg       min       p50       p95       p99       max
265s        11.766     0.376    10.927    21.423    24.415    27.199
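Read together with the LRANGE_100 section, these figures scale plausibly with reply size rather than indicating a regression:

  LRANGE_100: 89525 req/s x 100 elements ~ 8.95M elements/s
  LRANGE_300: 23121 req/s x 300 elements ~ 6.94M elements/s

so the server completes fewer requests per second but moves a comparable order of elements per second as replies grow.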
273s LRANGE_500 (first 500 elements): rps=11625.5 (overall: 12238.0) avg_msec=22.720 (overall: 20.182)
273s ====== LRANGE_500 (first 500 elements) ======
273s   100000 requests completed in 8.18 seconds
273s   50 parallel clients
273s   3 bytes payload
273s   keep alive: 1
273s   host configuration "save": 3600 1 300 100 60 10000
273s   host configuration "appendonly": no
273s   multi-thread: no
273s
273s Latency by percentile distribution:
273s 0.000% <= 0.535 milliseconds (cumulative count 10)
273s 50.000% <= 20.719 milliseconds (cumulative count 50070)
273s 75.000% <= 26.591 milliseconds (cumulative count 75040)
273s 87.500% <= 30.143 milliseconds (cumulative count 87530)
273s 93.750% <= 32.863 milliseconds (cumulative count 93770)
273s 96.875% <= 34.943 milliseconds (cumulative count 96880)
273s 98.438% <= 36.543 milliseconds (cumulative count 98450)
273s 99.219% <= 37.439 milliseconds (cumulative count 99220)
273s 99.609% <= 38.335 milliseconds (cumulative count 99610)
273s 99.805% <= 39.007 milliseconds (cumulative count 99810)
273s 99.902% <= 39.711 milliseconds (cumulative count 99910)
273s 99.951% <= 40.159 milliseconds (cumulative count 99960)
273s 99.976% <= 40.639 milliseconds (cumulative count 99980)
273s 99.988% <= 40.863 milliseconds (cumulative count 99990)
273s 99.994% <= 41.119 milliseconds (cumulative count 100000)
273s 100.000% <= 41.119 milliseconds (cumulative count 100000)
273s
273s Cumulative distribution of latencies:
273s 0.000% <= 0.103 milliseconds (cumulative count 0)
273s 0.010% <= 0.607 milliseconds (cumulative count 10)
273s 0.030% <= 1.103 milliseconds (cumulative count 30)
273s 0.060% <= 1.303 milliseconds (cumulative count 60)
273s 0.120% <= 1.407 milliseconds (cumulative count 120)
273s 0.130% <= 1.503 milliseconds (cumulative count 130)
273s 0.250% <= 1.607 milliseconds (cumulative count 250)
273s 0.410% <= 1.703 milliseconds (cumulative count 410)
273s 0.520% <= 1.807 milliseconds (cumulative count 520)
273s 0.720% <= 1.903 milliseconds (cumulative count 720)
273s 0.990% <= 2.007 milliseconds (cumulative count 990)
273s 1.190% <= 2.103 milliseconds (cumulative count 1190)
273s 3.870% <= 3.103 milliseconds (cumulative count 3870)
273s 5.460% <= 4.103 milliseconds (cumulative count 5460)
273s 6.570% <= 5.103 milliseconds (cumulative count 6570)
273s 7.570% <= 6.103 milliseconds (cumulative count 7570)
273s 8.810% <= 7.103 milliseconds (cumulative count 8810)
273s 10.060% <= 8.103 milliseconds (cumulative count 10060)
273s 11.440% <= 9.103 milliseconds (cumulative count 11440)
273s 13.190% <= 10.103 milliseconds (cumulative count 13190)
273s 16.000% <= 11.103 milliseconds (cumulative count 16000)
273s 18.960% <= 12.103 milliseconds (cumulative count 18960)
273s 21.960% <= 13.103 milliseconds (cumulative count 21960)
273s 24.880% <= 14.103 milliseconds (cumulative count 24880)
273s 27.730% <= 15.103 milliseconds (cumulative count 27730)
273s 30.600% <= 16.103 milliseconds (cumulative count 30600)
273s 33.410% <= 17.103 milliseconds (cumulative count 33410)
273s 37.210% <= 18.111 milliseconds (cumulative count 37210)
273s 41.310% <= 19.103 milliseconds (cumulative count 41310)
273s 46.480% <= 20.111 milliseconds (cumulative count 46480)
273s 52.060% <= 21.103 milliseconds (cumulative count 52060)
273s 56.880% <= 22.111 milliseconds (cumulative count 56880)
273s 61.090% <= 23.103 milliseconds (cumulative count 61090)
273s 64.990% <= 24.111 milliseconds (cumulative count 64990)
273s 68.880% <= 25.103 milliseconds (cumulative count 68880)
273s 73.130% <= 26.111 milliseconds (cumulative count 73130)
273s 76.910% <= 27.103 milliseconds (cumulative count 76910)
273s 80.640% <= 28.111 milliseconds (cumulative count 80640)
273s 84.120% <= 29.103 milliseconds (cumulative count 84120)
273s 87.410% <= 30.111 milliseconds (cumulative count 87410)
273s 90.110% <= 31.103 milliseconds (cumulative count 90110)
273s 92.360% <= 32.111 milliseconds (cumulative count 92360)
273s 94.180% <= 33.119 milliseconds (cumulative count 94180)
273s 95.710% <= 34.111 milliseconds (cumulative count 95710)
273s 97.060% <= 35.103 milliseconds (cumulative count 97060)
273s 98.210% <= 36.127 milliseconds (cumulative count 98210)
273s 98.950% <= 37.119 milliseconds (cumulative count 98950)
273s 99.560% <= 38.111 milliseconds (cumulative count 99560)
273s 99.820% <= 39.103 milliseconds (cumulative count 99820)
273s 99.940% <= 40.127 milliseconds (cumulative count 99940)
273s 100.000% <= 41.119 milliseconds (cumulative count 100000)
273s
273s Summary:
273s   throughput summary: 12227.93 requests per second
273s   latency summary (msec):
273s           avg       min       p50       p95       p99       max
273s        20.220     0.528    20.719    33.663    37.183    41.119
283s LRANGE_600 (first 600 elements): rps=8704.0 (overall: 9945.1) avg_msec=27.381 (overall: 23.543)
283s ====== LRANGE_600 (first 600 elements) ======
283s   100000 requests completed in 10.06 seconds
283s   50 parallel clients
283s   3 bytes payload
283s   keep alive: 1
283s   host configuration "save": 3600 1 300 100 60 10000
283s   host configuration "appendonly": no
283s   multi-thread: no
283s
283s Latency by percentile distribution:
283s 0.000% <= 0.607 milliseconds (cumulative count 10)
283s 50.000% <= 24.783 milliseconds (cumulative count 50050)
283s 75.000% <= 31.135 milliseconds (cumulative count 75070)
283s 87.500% <= 34.847 milliseconds (cumulative count 87550)
283s 93.750% <= 37.279 milliseconds (cumulative count 93790)
283s 96.875% <= 39.359 milliseconds (cumulative count 96880)
283s 98.438% <= 41.023 milliseconds (cumulative count 98450)
283s 99.219% <= 42.079 milliseconds (cumulative count 99220)
283s 99.609% <= 42.975 milliseconds (cumulative count 99610)
283s 99.805% <= 43.647 milliseconds (cumulative count 99810)
283s 99.902% <= 44.063 milliseconds (cumulative count 99910)
283s 99.951% <= 44.319 milliseconds (cumulative count 99960)
283s 99.976% <= 44.895 milliseconds (cumulative count 99980)
283s 99.988% <= 45.183 milliseconds (cumulative count 99990)
283s 99.994% <= 45.503 milliseconds (cumulative count 100000)
283s 100.000% <= 45.503 milliseconds (cumulative count 100000)
283s
283s Cumulative distribution of latencies:
283s 0.000% <= 0.103 milliseconds (cumulative count 0)
283s 0.010% <= 0.607 milliseconds (cumulative count 10)
283s 0.020% <= 1.007 milliseconds (cumulative count 20)
283s 0.080% <= 1.103 milliseconds (cumulative count 80)
283s 0.130% <= 1.207 milliseconds (cumulative count 130)
283s 0.190% <= 1.303 milliseconds (cumulative count 190)
283s 0.270% <= 1.407 milliseconds (cumulative count 270)
283s 0.400% <= 1.503 milliseconds (cumulative count 400)
283s 0.500% <= 1.607 milliseconds (cumulative count 500)
283s 0.660% <= 1.703 milliseconds (cumulative count 660)
283s 0.800% <= 1.807 milliseconds (cumulative count 800)
283s 0.920% <= 1.903 milliseconds (cumulative count 920)
283s 1.230% <= 2.007 milliseconds (cumulative count 1230)
283s 1.390% <= 2.103 milliseconds (cumulative count 1390)
283s 3.530% <= 3.103 milliseconds (cumulative count 3530)
283s 4.640% <= 4.103 milliseconds (cumulative count 4640)
283s 5.390% <= 5.103 milliseconds (cumulative count 5390)
283s 5.970% <= 6.103 milliseconds (cumulative count 5970)
283s 6.430% <= 7.103 milliseconds (cumulative count 6430)
283s 7.080% <= 8.103 milliseconds (cumulative count 7080)
283s 8.320% <= 9.103 milliseconds (cumulative count 8320)
283s 10.330% <= 10.103 milliseconds (cumulative count 10330)
283s 12.760% <= 11.103 milliseconds (cumulative count 12760)
283s 15.510% <= 12.103 milliseconds (cumulative count 15510)
283s 17.950% <= 13.103 milliseconds (cumulative count 17950)
283s 20.470% <= 14.103 milliseconds (cumulative count 20470)
283s 22.850% <= 15.103 milliseconds (cumulative count 22850)
283s 25.430% <= 16.103 milliseconds (cumulative count 25430)
283s 27.740% <= 17.103 milliseconds (cumulative count 27740)
283s 29.820% <= 18.111 milliseconds (cumulative count 29820)
283s 31.810% <= 19.103 milliseconds (cumulative count 31810)
283s 33.820% <= 20.111 milliseconds (cumulative count 33820)
283s 35.790% <= 21.103 milliseconds (cumulative count 35790)
283s 38.640% <= 22.111 milliseconds (cumulative count 38640)
283s 42.280% <= 23.103 milliseconds (cumulative count 42280)
283s 46.750% <= 24.111 milliseconds (cumulative count 46750)
283s 51.630% <= 25.103 milliseconds (cumulative count 51630)
283s 56.050% <= 26.111 milliseconds (cumulative count 56050)
283s 59.870% <= 27.103 milliseconds (cumulative count 59870)
283s 63.670% <= 28.111 milliseconds (cumulative count 63670)
283s 67.140% <= 29.103 milliseconds (cumulative count 67140)
283s 71.000% <= 30.111 milliseconds (cumulative count 71000)
283s 74.910% <= 31.103 milliseconds (cumulative count 74910)
283s 78.680% <= 32.111 milliseconds (cumulative count 78680)
283s 82.230% <= 33.119 milliseconds (cumulative count 82230)
283s 85.400% <= 34.111 milliseconds (cumulative count 85400)
283s 88.260% <= 35.103 milliseconds (cumulative count 88260)
283s 91.290% <= 36.127 milliseconds (cumulative count 91290)
283s 93.500% <= 37.119 milliseconds (cumulative count 93500)
283s 95.250% <= 38.111 milliseconds (cumulative count 95250)
284s 96.520% <= 39.103 milliseconds (cumulative count 96520)
284s 97.630% <= 40.127 milliseconds (cumulative count 97630)
284s 98.530% <= 41.119 milliseconds (cumulative count 98530)
284s 99.240% <= 42.111 milliseconds (cumulative count 99240)
284s 99.670% <= 43.103 milliseconds (cumulative count 99670)
284s 99.930% <= 44.127 milliseconds (cumulative count 99930)
284s 99.980% <= 45.119 milliseconds (cumulative count 99980)
284s 100.000% <= 46.111 milliseconds (cumulative count 100000)
284s
284s Summary:
284s   throughput summary: 9944.31 requests per second
284s   latency summary (msec):
284s           avg       min       p50       p95       p99       max
284s        23.556     0.600    24.783    37.951    41.727    45.503
284s MSET (10 keys): rps=175800.0 (overall: 175416.7) avg_msec=2.719 (overall: 2.721)
284s ====== MSET (10 keys) ======
284s   100000 requests completed in 0.57 seconds
284s   50 parallel clients
284s   3 bytes payload
284s   keep alive: 1
284s   host configuration "save": 3600 1 300 100 60 10000
284s   host configuration "appendonly": no
284s   multi-thread: no
284s
284s Latency by percentile distribution:
284s 0.000% <= 0.447 milliseconds (cumulative count 10)
284s 50.000% <= 2.783 milliseconds (cumulative count 50200)
284s 75.000% <= 2.935 milliseconds (cumulative count 75940)
284s 87.500% <= 3.031 milliseconds (cumulative count 87960)
284s 93.750% <= 3.111 milliseconds (cumulative count 94160)
284s 96.875% <= 3.191 milliseconds (cumulative count 97020)
284s 98.438% <= 3.271 milliseconds (cumulative count 98450)
284s 99.219% <= 3.335 milliseconds (cumulative count 99220)
284s 99.609% <= 3.391 milliseconds (cumulative count 99640)
284s 99.805% <= 3.431 milliseconds (cumulative count 99820)
284s 99.902% <= 3.471 milliseconds (cumulative count 99920)
284s 99.951% <= 3.511 milliseconds (cumulative count 99960)
284s 99.976% <= 3.559 milliseconds (cumulative count 99980)
284s 99.988% <= 3.583 milliseconds (cumulative count 100000)
284s 100.000% <= 3.583 milliseconds (cumulative count 100000)
284s
284s Cumulative distribution of latencies:
284s 0.000% <= 0.103 milliseconds (cumulative count 0)
284s 0.010% <= 0.503 milliseconds (cumulative count 10)
284s 0.020% <= 0.703 milliseconds (cumulative count 20)
284s 0.070% <= 0.807 milliseconds (cumulative count 70)
284s 0.130% <= 0.903 milliseconds (cumulative count 130)
284s 0.160% <= 1.303 milliseconds (cumulative count 160)
284s 0.270% <= 1.407 milliseconds (cumulative count 270)
284s 0.510% <= 1.503 milliseconds (cumulative count 510)
284s 2.620% <= 1.607 milliseconds (cumulative count 2620)
284s 5.480% <= 1.703 milliseconds (cumulative count 5480)
284s 7.510% <= 1.807 milliseconds (cumulative count 7510)
284s 8.000% <= 1.903 milliseconds (cumulative count 8000)
284s 8.170% <= 2.007 milliseconds (cumulative count 8170)
284s 8.230% <= 2.103 milliseconds (cumulative count 8230)
284s 93.710% <= 3.103 milliseconds (cumulative count 93710)
284s 100.000% <= 4.103 milliseconds (cumulative count 100000)
284s
284s Summary:
284s   throughput summary: 175746.92 requests per second
284s   latency summary (msec):
284s           avg       min       p50       p95       p99       max
284s         2.719     0.440     2.783     3.127     3.319     3.583
284s XADD: rps=157490.0 (overall: 288540.2) avg_msec=1.605 (overall: 1.605)
284s ====== XADD ======
284s   100000 requests completed in 0.35 seconds
284s   50 parallel clients
284s   3 bytes payload
284s   keep alive: 1
284s   host configuration "save": 3600 1 300 100 60 10000
284s   host configuration "appendonly": no
284s   multi-thread: no
284s
284s Latency by percentile distribution:
284s 0.000% <= 0.415 milliseconds (cumulative count 10)
284s 50.000% <= 1.663 milliseconds (cumulative count 50930)
284s 75.000% <= 1.791 milliseconds (cumulative count 75300)
284s 87.500% <= 1.871 milliseconds (cumulative count 87660)
284s 93.750% <= 1.935 milliseconds (cumulative count 94150)
284s 96.875% <= 1.991 milliseconds (cumulative count 97100)
284s 98.438% <= 2.047 milliseconds (cumulative count 98440)
284s 99.219% <= 2.111 milliseconds (cumulative count 99280)
284s 99.609% <= 2.159 milliseconds (cumulative count 99620)
284s 99.805% <= 2.207 milliseconds (cumulative count 99820)
284s 99.902% <= 2.247 milliseconds (cumulative count 99910)
284s 99.951% <= 2.287 milliseconds (cumulative count 99960)
284s 99.976% <= 2.335 milliseconds (cumulative count 99980)
284s 99.988% <= 2.367 milliseconds (cumulative count 99990)
284s 99.994% <= 2.407 milliseconds (cumulative count 100000)
284s 100.000% <= 2.407 milliseconds (cumulative count 100000)
284s
284s Cumulative distribution of latencies:
284s 0.000% <= 0.103 milliseconds (cumulative count 0)
284s 0.070% <= 0.503 milliseconds (cumulative count 70)
284s 0.120% <= 0.607 milliseconds (cumulative count 120)
284s 0.280% <= 0.903 milliseconds (cumulative count 280)
284s 1.650% <= 1.007 milliseconds (cumulative count 1650)
284s 6.930% <= 1.103 milliseconds (cumulative count 6930)
284s 10.480% <= 1.207 milliseconds (cumulative count 10480)
284s 11.340% <= 1.303 milliseconds (cumulative count 11340)
284s 14.610% <= 1.407 milliseconds (cumulative count 14610)
284s 23.610% <= 1.503 milliseconds (cumulative count 23610)
284s 40.260% <= 1.607 milliseconds (cumulative count 40260)
284s 58.710% <= 1.703 milliseconds (cumulative count 58710)
284s 78.130% <= 1.807 milliseconds (cumulative count 78130)
284s 91.370% <= 1.903 milliseconds (cumulative count 91370)
284s 97.590% <= 2.007 milliseconds (cumulative count 97590)
284s 99.210% <= 2.103 milliseconds (cumulative count 99210)
284s 100.000% <= 3.103 milliseconds (cumulative count 100000)
284s
284s Summary:
284s   throughput summary: 289017.34 requests per second
284s   latency summary (msec):
284s           avg       min       p50       p95       p99       max
284s         1.622     0.408     1.663     1.951     2.087     2.407
284s
285s autopkgtest [18:20:39]: test 0002-benchmark: -----------------------]
289s 0002-benchmark PASS
289s autopkgtest [18:20:43]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - -
293s autopkgtest [18:20:47]: test 0003-valkey-check-aof: preparing testbed
295s Reading package lists...
295s Building dependency tree...
295s Reading state information...
295s Starting pkgProblemResolver with broken count: 0
295s Starting 2 pkgProblemResolver with broken count: 0
295s Done
296s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
305s autopkgtest [18:20:59]: test 0003-valkey-check-aof: [-----------------------
308s autopkgtest [18:21:02]: test 0003-valkey-check-aof: -----------------------]
312s 0003-valkey-check-aof PASS
312s autopkgtest [18:21:06]: test 0003-valkey-check-aof: - - - - - - - - - - results - - - - - - - - - -
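Tests 0003 and 0004 exercise the offline integrity checkers shipped in valkey-tools; only the RDB checker prints output in this log (below). Their command-line shape is simply tool-plus-file. The RDB path is the one this run actually checks; the AOF path is an assumption, since the 0003 test does not echo it:

  # Both tools exit non-zero when the file fails validation.
  valkey-check-aof /var/lib/valkey/appendonlydir/appendonly.aof.1.incr.aof   # assumed path
  valkey-check-rdb /var/lib/valkey/dump.rdb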
316s autopkgtest [18:21:10]: test 0004-valkey-check-rdb: preparing testbed
318s Reading package lists...
318s Building dependency tree...
318s Reading state information...
318s Starting pkgProblemResolver with broken count: 0
318s Starting 2 pkgProblemResolver with broken count: 0
318s Done
319s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
327s autopkgtest [18:21:21]: test 0004-valkey-check-rdb: [-----------------------
334s OK
334s [offset 0] Checking RDB file /var/lib/valkey/dump.rdb
334s [offset 27] AUX FIELD valkey-ver = '8.0.2'
334s [offset 41] AUX FIELD redis-bits = '32'
334s [offset 53] AUX FIELD ctime = '1742062888'
334s [offset 68] AUX FIELD used-mem = '2835304'
334s [offset 80] AUX FIELD aof-base = '0'
334s [offset 82] Selecting DB ID 0
334s [offset 566363] Checksum OK
334s [offset 566363] \o/ RDB looks OK! \o/
334s [info] 5 keys read
334s [info] 0 expires
334s [info] 0 already expired
335s autopkgtest [18:21:29]: test 0004-valkey-check-rdb: -----------------------]
340s autopkgtest [18:21:34]: test 0004-valkey-check-rdb: - - - - - - - - - - results - - - - - - - - - -
340s 0004-valkey-check-rdb PASS
344s autopkgtest [18:21:38]: test 0005-cjson: preparing testbed
345s Reading package lists...
346s Building dependency tree...
346s Reading state information...
346s Starting pkgProblemResolver with broken count: 0
346s Starting 2 pkgProblemResolver with broken count: 0
346s Done
347s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
355s autopkgtest [18:21:49]: test 0005-cjson: [-----------------------
363s
363s autopkgtest [18:21:57]: test 0005-cjson: -----------------------]
368s autopkgtest [18:22:02]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
368s 0005-cjson PASS
372s autopkgtest [18:22:06]: test 0006-migrate-from-redis: preparing testbed
396s autopkgtest [18:22:30]: testbed dpkg architecture: armhf
398s autopkgtest [18:22:32]: testbed apt version: 2.9.33
402s autopkgtest [18:22:36]: @@@@@@@@@@@@@@@@@@@@ test bed setup
404s autopkgtest [18:22:38]: testbed release detected to be: plucky
412s autopkgtest [18:22:46]: updating testbed package index (apt update)
414s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
414s Get:2 http://ftpmaster.internal/ubuntu plucky InRelease [257 kB]
414s Get:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease [126 kB]
415s Get:4 http://ftpmaster.internal/ubuntu plucky-security InRelease [126 kB]
415s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
415s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [99.7 kB]
415s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [379 kB]
415s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf Packages [114 kB]
415s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf c-n-f Metadata [1832 B]
415s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted armhf c-n-f Metadata [116 B]
415s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf Packages [312 kB]
415s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe armhf c-n-f Metadata [11.1 kB]
415s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf Packages [3472 B]
415s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse armhf c-n-f Metadata [240 B]
415s Get:15 http://ftpmaster.internal/ubuntu plucky/main Sources [1394 kB]
416s Get:16 http://ftpmaster.internal/ubuntu plucky/multiverse Sources [299 kB]
416s Get:17 http://ftpmaster.internal/ubuntu plucky/universe Sources [21.0 MB]
430s Get:18 http://ftpmaster.internal/ubuntu plucky/main armhf Packages [1378 kB]
431s Get:19 http://ftpmaster.internal/ubuntu plucky/main armhf c-n-f Metadata [29.4 kB]
431s Get:20 http://ftpmaster.internal/ubuntu plucky/restricted armhf c-n-f Metadata [108 B]
431s Get:21 http://ftpmaster.internal/ubuntu plucky/universe armhf Packages [15.1 MB]
439s Get:22 http://ftpmaster.internal/ubuntu plucky/multiverse armhf Packages [172 kB]
441s Fetched 41.0 MB in 27s (1502 kB/s)
442s Reading package lists...
449s autopkgtest [18:23:23]: upgrading testbed (apt dist-upgrade and autopurge)
451s Reading package lists...
451s Building dependency tree...
451s Reading state information...
452s Calculating upgrade...
452s Starting pkgProblemResolver with broken count: 0
452s Starting 2 pkgProblemResolver with broken count: 0
452s Done
452s Entering ResolveByKeep
453s
453s Calculating upgrade...
453s The following packages will be upgraded:
453s   libc-bin libc6 locales pinentry-curses python3-jinja2 sos strace
453s 7 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
453s Need to get 8683 kB of archives.
453s After this operation, 23.6 kB of additional disk space will be used.
453s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc6 armhf 2.41-1ubuntu2 [2932 kB]
455s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf libc-bin armhf 2.41-1ubuntu2 [545 kB]
455s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main armhf locales all 2.41-1ubuntu2 [4246 kB]
457s Get:4 http://ftpmaster.internal/ubuntu plucky/main armhf strace armhf 6.13+ds-1ubuntu1 [445 kB]
457s Get:5 http://ftpmaster.internal/ubuntu plucky/main armhf pinentry-curses armhf 1.3.1-2ubuntu3 [40.6 kB]
457s Get:6 http://ftpmaster.internal/ubuntu plucky/main armhf python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
457s Get:7 http://ftpmaster.internal/ubuntu plucky/main armhf sos all 4.9.0-5 [365 kB]
457s Preconfiguring packages ...
458s Fetched 8683 kB in 4s (2280 kB/s)
458s (Reading database ... 64655 files and directories currently installed.)
458s Preparing to unpack .../libc6_2.41-1ubuntu2_armhf.deb ...
458s Unpacking libc6:armhf (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
458s Setting up libc6:armhf (2.41-1ubuntu2) ...
458s (Reading database ... 64655 files and directories currently installed.)
458s Preparing to unpack .../libc-bin_2.41-1ubuntu2_armhf.deb ...
458s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
458s Setting up libc-bin (2.41-1ubuntu2) ...
459s (Reading database ... 64655 files and directories currently installed.)
459s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
459s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
459s Preparing to unpack .../strace_6.13+ds-1ubuntu1_armhf.deb ...
459s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
459s Preparing to unpack .../pinentry-curses_1.3.1-2ubuntu3_armhf.deb ...
459s Unpacking pinentry-curses (1.3.1-2ubuntu3) over (1.3.1-2ubuntu2) ...
459s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
459s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
459s Preparing to unpack .../archives/sos_4.9.0-5_all.deb ...
460s Unpacking sos (4.9.0-5) over (4.9.0-4) ...
460s Setting up sos (4.9.0-5) ...
460s Setting up pinentry-curses (1.3.1-2ubuntu3) ...
460s Setting up locales (2.41-1ubuntu2) ...
461s Generating locales (this might take a while)...
463s   en_US.UTF-8... done
463s Generation complete.
463s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
463s Setting up strace (6.13+ds-1ubuntu1) ...
463s Processing triggers for man-db (2.13.0-1) ...
465s Processing triggers for systemd (257.3-1ubuntu3) ...
466s Reading package lists...
467s Building dependency tree...
467s Reading state information...
467s Starting pkgProblemResolver with broken count: 0
467s Starting 2 pkgProblemResolver with broken count: 0
467s Done
467s Solving dependencies...
468s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
470s autopkgtest [18:23:44]: rebooting testbed after setup commands that affected boot
552s Reading package lists...
553s Building dependency tree...
553s Reading state information...
553s Starting pkgProblemResolver with broken count: 0
554s Starting 2 pkgProblemResolver with broken count: 0
554s Done
556s The following NEW packages will be installed:
556s   liblzf1 redis-sentinel redis-server redis-tools
556s 0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded.
556s Need to get 1008 kB of archives.
556s After this operation, 4280 kB of additional disk space will be used.
556s Get:1 http://ftpmaster.internal/ubuntu plucky/universe armhf liblzf1 armhf 3.6-4 [6554 B]
556s Get:2 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-tools armhf 5:7.0.15-3 [937 kB]
557s Get:3 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-sentinel armhf 5:7.0.15-3 [12.2 kB]
557s Get:4 http://ftpmaster.internal/ubuntu plucky/universe armhf redis-server armhf 5:7.0.15-3 [51.7 kB]
558s Fetched 1008 kB in 1s (765 kB/s)
558s Selecting previously unselected package liblzf1:armhf.
558s (Reading database ... 64655 files and directories currently installed.)
558s Preparing to unpack .../liblzf1_3.6-4_armhf.deb ...
558s Unpacking liblzf1:armhf (3.6-4) ...
558s Selecting previously unselected package redis-tools.
558s Preparing to unpack .../redis-tools_5%3a7.0.15-3_armhf.deb ...
558s Unpacking redis-tools (5:7.0.15-3) ...
558s Selecting previously unselected package redis-sentinel.
558s Preparing to unpack .../redis-sentinel_5%3a7.0.15-3_armhf.deb ...
558s Unpacking redis-sentinel (5:7.0.15-3) ...
558s Selecting previously unselected package redis-server.
558s Preparing to unpack .../redis-server_5%3a7.0.15-3_armhf.deb ...
558s Unpacking redis-server (5:7.0.15-3) ...
558s Setting up liblzf1:armhf (3.6-4) ...
558s Setting up redis-tools (5:7.0.15-3) ...
558s Setting up redis-server (5:7.0.15-3) ...
559s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'.
559s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'.
559s Setting up redis-sentinel (5:7.0.15-3) ...
560s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
560s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
560s Processing triggers for man-db (2.13.0-1) ...
561s Processing triggers for libc-bin (2.41-1ubuntu2) ...
581s autopkgtest [18:25:35]: test 0006-migrate-from-redis: [-----------------------
583s + FLAG_FILE=/etc/valkey/REDIS_MIGRATION
583s + sed -i 's#loglevel notice#loglevel debug#' /etc/redis/redis.conf
583s + systemctl restart redis-server
583s + redis-cli -h 127.0.0.1 -p 6379 SET test 1
583s OK
583s + redis-cli -h 127.0.0.1 -p 6379 GET test
583s 1
583s + redis-cli -h 127.0.0.1 -p 6379 SAVE
583s OK
583s + sha256sum /var/lib/redis/dump.rdb
583s e4796588a19186cd00e1cc699e41b35577ecf282b31dfbd7a3cce93d6155d66e /var/lib/redis/dump.rdb
583s + apt-get install -y valkey-redis-compat
584s Reading package lists...
584s Building dependency tree...
584s Reading state information...
585s Solving dependencies...
586s The following additional packages will be installed:
586s   valkey-server valkey-tools
586s Suggested packages:
586s   ruby-redis
586s The following packages will be REMOVED:
586s   redis-sentinel redis-server redis-tools
586s The following NEW packages will be installed:
586s   valkey-redis-compat valkey-server valkey-tools
586s 0 upgraded, 3 newly installed, 3 to remove and 0 not upgraded.
586s Need to get 1226 kB of archives.
586s After this operation, 721 kB of additional disk space will be used.
586s Get:1 http://ftpmaster.internal/ubuntu plucky/universe armhf valkey-tools armhf 8.0.2+dfsg1-1ubuntu1 [1170 kB]
588s Get:2 http://ftpmaster.internal/ubuntu plucky/universe armhf valkey-server armhf 8.0.2+dfsg1-1ubuntu1 [48.5 kB]
588s Get:3 http://ftpmaster.internal/ubuntu plucky/universe armhf valkey-redis-compat all 8.0.2+dfsg1-1ubuntu1 [7744 B]
588s Fetched 1226 kB in 1s (908 kB/s)
588s (Reading database ... 64706 files and directories currently installed.)
589s Removing redis-sentinel (5:7.0.15-3) ...
589s Removing redis-server (5:7.0.15-3) ...
590s Removing redis-tools (5:7.0.15-3) ...
590s Selecting previously unselected package valkey-tools.
590s (Reading database ... 64669 files and directories currently installed.)
590s Preparing to unpack .../valkey-tools_8.0.2+dfsg1-1ubuntu1_armhf.deb ...
590s Unpacking valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
590s Selecting previously unselected package valkey-server.
590s Preparing to unpack .../valkey-server_8.0.2+dfsg1-1ubuntu1_armhf.deb ...
590s Unpacking valkey-server (8.0.2+dfsg1-1ubuntu1) ...
590s Selecting previously unselected package valkey-redis-compat.
590s Preparing to unpack .../valkey-redis-compat_8.0.2+dfsg1-1ubuntu1_all.deb ...
590s Unpacking valkey-redis-compat (8.0.2+dfsg1-1ubuntu1) ...
590s Setting up valkey-tools (8.0.2+dfsg1-1ubuntu1) ...
590s Setting up valkey-server (8.0.2+dfsg1-1ubuntu1) ...
591s Created symlink '/etc/systemd/system/valkey.service' → '/usr/lib/systemd/system/valkey-server.service'.
591s Created symlink '/etc/systemd/system/multi-user.target.wants/valkey-server.service' → '/usr/lib/systemd/system/valkey-server.service'.
591s Setting up valkey-redis-compat (8.0.2+dfsg1-1ubuntu1) ...
591s dpkg-query: no packages found matching valkey-sentinel
591s [I] /etc/redis/redis.conf has been copied to /etc/valkey/valkey.conf. Please review the content of valkey.conf, especially if you had modified redis.conf.
591s [I] /etc/redis/sentinel.conf has been copied to /etc/valkey/sentinel.conf. Please review the content of sentinel.conf, especially if you had modified sentinel.conf.
591s [I] On-disk redis dumps moved from /var/lib/redis/ to /var/lib/valkey.
591s Processing triggers for man-db (2.13.0-1) ...
592s + '[' -f /etc/valkey/REDIS_MIGRATION ']'
592s + sha256sum /var/lib/valkey/dump.rdb
592s cce2491b5941db8655627cb874f0f7a1169175f4654ad024130d4637c485c803 /var/lib/valkey/dump.rdb
592s + systemctl status valkey-server
592s + grep inactive
592s      Active: inactive (dead) since Sat 2025-03-15 18:25:45 UTC; 684ms ago
592s + rm /etc/valkey/REDIS_MIGRATION
592s + systemctl start valkey-server
592s + systemctl status valkey-server
592s + grep running
592s      Active: active (running) since Sat 2025-03-15 18:25:46 UTC; 19ms ago
592s + sha256sum /var/lib/valkey/dump.rdb
592s cce2491b5941db8655627cb874f0f7a1169175f4654ad024130d4637c485c803 /var/lib/valkey/dump.rdb
592s + cat /etc/valkey/valkey.conf
592s + grep loglevel
592s + grep debug
592s loglevel debug
592s + valkey-cli -h 127.0.0.1 -p 6379 GET test
592s + grep 1
592s 1
593s autopkgtest [18:25:47]: test 0006-migrate-from-redis: -----------------------]
597s autopkgtest [18:25:51]: test 0006-migrate-from-redis: - - - - - - - - - - results - - - - - - - - - -
597s 0006-migrate-from-redis PASS
601s autopkgtest [18:25:55]: @@@@@@@@@@@@@@@@@@@@ summary
601s 0001-valkey-cli PASS
601s 0002-benchmark PASS
601s 0003-valkey-check-aof PASS
601s 0004-valkey-check-rdb PASS
601s 0005-cjson PASS
601s 0006-migrate-from-redis PASS
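For reference, the 0006-migrate-from-redis run above reduces to the following flow. This is a condensed restatement of the commands traced in the log, not the packaged test script itself:

  # Seed data under redis, then migrate to valkey via the compat package.
  redis-cli -h 127.0.0.1 -p 6379 SET test 1
  redis-cli -h 127.0.0.1 -p 6379 SAVE
  sha256sum /var/lib/redis/dump.rdb
  apt-get install -y valkey-redis-compat    # removes redis-*, installs valkey-*
  sha256sum /var/lib/valkey/dump.rdb        # dump now lives in the valkey state dir
  rm /etc/valkey/REDIS_MIGRATION            # clear the migration flag
  systemctl start valkey-server
  valkey-cli -h 127.0.0.1 -p 6379 GET test  # expect "1"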