0s autopkgtest [17:17:47]: starting date and time: 2025-11-04 17:17:47+0000 0s autopkgtest [17:17:47]: git checkout: 4b346b80 nova: make wait_reboot return success even when a no-op 0s autopkgtest [17:17:47]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.ja86af4a/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:systemd --apt-upgrade redict --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=systemd/258.1-2ubuntu1 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-s390x --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos03-s390x-3.secgroup --name adt-resolute-s390x-redict-20251104-171747-juju-7f2275-prod-proposed-migration-environment-2-bee6b2c9-9b91-4acd-9ef6-48cb3944d8f0 --image adt/ubuntu-resolute-s390x-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration-s390x -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/ 3s Creating nova instance adt-resolute-s390x-redict-20251104-171747-juju-7f2275-prod-proposed-migration-environment-2-bee6b2c9-9b91-4acd-9ef6-48cb3944d8f0 from image adt/ubuntu-resolute-s390x-server-20251104.img (UUID 80285f34-81f0-4ef3-9458-279742a38e4e)... 65s autopkgtest [17:18:52]: testbed dpkg architecture: s390x 65s autopkgtest [17:18:52]: testbed apt version: 3.1.11 65s autopkgtest [17:18:52]: @@@@@@@@@@@@@@@@@@@@ test bed setup 66s autopkgtest [17:18:53]: testbed release detected to be: None 66s autopkgtest [17:18:53]: updating testbed package index (apt update) 67s Get:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease [87.8 kB] 67s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease 67s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease 67s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease 67s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/universe Sources [1009 kB] 68s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/main Sources [81.9 kB] 68s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse Sources [23.7 kB] 68s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/restricted Sources [9848 B] 68s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x Packages [119 kB] 68s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/restricted s390x Packages [940 B] 68s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/universe s390x Packages [619 kB] 68s Get:12 http://ftpmaster.internal/ubuntu resolute-proposed/multiverse s390x Packages [11.6 kB] 68s Fetched 1963 kB in 1s (1526 kB/s) 69s Reading package lists... 69s Hit:1 http://ftpmaster.internal/ubuntu resolute-proposed InRelease 69s Hit:2 http://ftpmaster.internal/ubuntu resolute InRelease 69s Hit:3 http://ftpmaster.internal/ubuntu resolute-updates InRelease 69s Hit:4 http://ftpmaster.internal/ubuntu resolute-security InRelease 70s Reading package lists... 70s Reading package lists... 70s Building dependency tree... 70s Reading state information... 70s Calculating upgrade... 
70s The following packages will be upgraded: 70s base-passwd bash-completion iputils-tracepath liblz4-1 libnss-systemd 70s libpam-systemd libsystemd-shared libsystemd0 libudev1 libxkbcommon0 systemd 70s systemd-cryptsetup systemd-resolved systemd-sysv udev 71s 15 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 71s Need to get 9192 kB of archives. 71s After this operation, 1103 kB of additional disk space will be used. 71s Get:1 http://ftpmaster.internal/ubuntu resolute/main s390x base-passwd s390x 3.6.8 [54.9 kB] 71s Get:2 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x libsystemd0 s390x 258.1-2ubuntu1 [547 kB] 71s Get:3 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x systemd-sysv s390x 258.1-2ubuntu1 [9254 B] 71s Get:4 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x systemd-resolved s390x 258.1-2ubuntu1 [345 kB] 71s Get:5 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x libnss-systemd s390x 258.1-2ubuntu1 [188 kB] 71s Get:6 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x libpam-systemd s390x 258.1-2ubuntu1 [275 kB] 71s Get:7 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x libsystemd-shared s390x 258.1-2ubuntu1 [2395 kB] 72s Get:8 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x systemd s390x 258.1-2ubuntu1 [3092 kB] 73s Get:9 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x systemd-cryptsetup s390x 258.1-2ubuntu1 [126 kB] 73s Get:10 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x udev s390x 258.1-2ubuntu1 [1470 kB] 73s Get:11 http://ftpmaster.internal/ubuntu resolute-proposed/main s390x libudev1 s390x 258.1-2ubuntu1 [213 kB] 73s Get:12 http://ftpmaster.internal/ubuntu resolute/main s390x liblz4-1 s390x 1.10.0-6 [84.6 kB] 73s Get:13 http://ftpmaster.internal/ubuntu resolute/main s390x bash-completion all 1:2.16.0-8 [214 kB] 73s Get:14 http://ftpmaster.internal/ubuntu resolute/main s390x iputils-tracepath s390x 3:20250605-1ubuntu1 [14.8 kB] 73s Get:15 http://ftpmaster.internal/ubuntu resolute/main s390x libxkbcommon0 s390x 1.12.3-1 [163 kB] 73s dpkg-preconfigure: unable to re-open stdin: No such file or directory 73s Fetched 9192 kB in 3s (3300 kB/s) 74s (Reading database ... 56953 files and directories currently installed.) 74s Preparing to unpack .../base-passwd_3.6.8_s390x.deb ... 74s Unpacking base-passwd (3.6.8) over (3.6.7) ... 74s Setting up base-passwd (3.6.8) ...
74s (Reading database ... 56953 files and directories currently installed.) 74s Preparing to unpack .../libsystemd0_258.1-2ubuntu1_s390x.deb ... 74s Unpacking libsystemd0:s390x (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Setting up libsystemd0:s390x (258.1-2ubuntu1) ... 74s (Reading database ... 56953 files and directories currently installed.) 74s Preparing to unpack .../systemd-sysv_258.1-2ubuntu1_s390x.deb ... 74s Unpacking systemd-sysv (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../systemd-resolved_258.1-2ubuntu1_s390x.deb ... 74s Unpacking systemd-resolved (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../libnss-systemd_258.1-2ubuntu1_s390x.deb ... 74s Unpacking libnss-systemd:s390x (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../libpam-systemd_258.1-2ubuntu1_s390x.deb ... 74s Unpacking libpam-systemd:s390x (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../libsystemd-shared_258.1-2ubuntu1_s390x.deb ... 74s Unpacking libsystemd-shared:s390x (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Setting up libsystemd-shared:s390x (258.1-2ubuntu1) ... 74s (Reading database ... 56952 files and directories currently installed.) 74s Preparing to unpack .../systemd_258.1-2ubuntu1_s390x.deb ... 74s Unpacking systemd (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../systemd-cryptsetup_258.1-2ubuntu1_s390x.deb ... 74s Unpacking systemd-cryptsetup (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../udev_258.1-2ubuntu1_s390x.deb ... 74s Unpacking udev (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 74s Preparing to unpack .../libudev1_258.1-2ubuntu1_s390x.deb ... 74s Unpacking libudev1:s390x (258.1-2ubuntu1) over (257.9-0ubuntu2) ... 75s Setting up libudev1:s390x (258.1-2ubuntu1) ... 75s (Reading database ... 57002 files and directories currently installed.)
75s Preparing to unpack .../liblz4-1_1.10.0-6_s390x.deb ... 75s Unpacking liblz4-1:s390x (1.10.0-6) over (1.10.0-4build1) ... 75s Preparing to unpack .../bash-completion_1%3a2.16.0-8_all.deb ... 75s Unpacking bash-completion (1:2.16.0-8) over (1:2.16.0-7) ... 75s Preparing to unpack .../iputils-tracepath_3%3a20250605-1ubuntu1_s390x.deb ... 75s Unpacking iputils-tracepath (3:20250605-1ubuntu1) over (3:20240905-3ubuntu3) ... 75s Preparing to unpack .../libxkbcommon0_1.12.3-1_s390x.deb ... 75s Unpacking libxkbcommon0:s390x (1.12.3-1) over (1.7.0-2.1) ... 75s Setting up liblz4-1:s390x (1.10.0-6) ... 75s Setting up systemd (258.1-2ubuntu1) ... 75s Installing new version of config file /etc/systemd/logind.conf ... 75s Installing new version of config file /etc/systemd/networkd.conf ... 75s Installing new version of config file /etc/systemd/system.conf ... 75s Installing new version of config file /etc/systemd/user.conf ... 75s /usr/lib/tmpfiles.d/legacy.conf:14: Duplicate line for path "/run/lock", ignoring. 75s /usr/lib/tmpfiles.d/legacy.conf:14: Duplicate line for path "/run/lock", ignoring. 76s Setting up systemd-cryptsetup (258.1-2ubuntu1) ... 76s Setting up bash-completion (1:2.16.0-8) ... 76s Setting up udev (258.1-2ubuntu1) ... 76s Setting up iputils-tracepath (3:20250605-1ubuntu1) ... 76s Setting up systemd-resolved (258.1-2ubuntu1) ... 76s Installing new version of config file /etc/systemd/resolved.conf ... 76s Created symlink '/etc/systemd/system/sockets.target.wants/systemd-resolved-monitor.socket' → '/usr/lib/systemd/system/systemd-resolved-monitor.socket'. 77s Created symlink '/etc/systemd/system/sockets.target.wants/systemd-resolved-varlink.socket' → '/usr/lib/systemd/system/systemd-resolved-varlink.socket'. 77s Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 148. 77s Setting up libxkbcommon0:s390x (1.12.3-1) ... 77s Setting up systemd-sysv (258.1-2ubuntu1) ... 77s Setting up libnss-systemd:s390x (258.1-2ubuntu1) ... 77s Setting up libpam-systemd:s390x (258.1-2ubuntu1) ... 77s Processing triggers for libc-bin (2.42-0ubuntu3) ... 77s Processing triggers for man-db (2.13.1-1) ... 78s Processing triggers for dbus (1.16.2-2ubuntu2) ... 78s Processing triggers for shared-mime-info (2.4-5build2) ... 79s Processing triggers for procps (2:4.0.4-8ubuntu3) ... 79s Processing triggers for initramfs-tools (0.150ubuntu4) ... 79s update-initramfs: Generating /boot/initrd.img-6.17.0-5-generic 82s Using config file '/etc/zipl.conf' 82s Building bootmap in '/boot' 82s Adding IPL section 'ubuntu' (default) 82s Preparing boot device for LD-IPL: vda (0000). 82s Done. 83s autopkgtest [17:19:10]: upgrading testbed (apt dist-upgrade and autopurge) 83s Reading package lists... 83s Building dependency tree... 83s Reading state information... 83s Calculating upgrade... 83s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 83s Reading package lists... 83s Building dependency tree... 83s Reading state information... 84s Solving dependencies... 84s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
84s autopkgtest [17:19:11]: rebooting testbed after setup commands that affected boot 99s autopkgtest [17:19:26]: testbed running kernel: Linux 6.17.0-5-generic #5-Ubuntu SMP Mon Sep 22 08:56:47 UTC 2025 102s autopkgtest [17:19:29]: @@@@@@@@@@@@@@@@@@@@ apt-source redict 106s Get:1 http://ftpmaster.internal/ubuntu resolute/universe redict 7.3.5+ds-1ubuntu0.1 (dsc) [2540 B] 106s Get:2 http://ftpmaster.internal/ubuntu resolute/universe redict 7.3.5+ds-1ubuntu0.1 (tar) [1743 kB] 106s Get:3 http://ftpmaster.internal/ubuntu resolute/universe redict 7.3.5+ds-1ubuntu0.1 (diff) [15.2 kB] 106s gpgv: Signature made Wed Oct 15 11:59:46 2025 UTC 106s gpgv: using RSA key 71FBF17BE7E52D0C2A2C9144F0B61460B04C4B56 106s gpgv: issuer "sudhakar.verma@canonical.com" 106s gpgv: Can't check signature: No public key 106s dpkg-source: warning: cannot verify inline signature for ./redict_7.3.5+ds-1ubuntu0.1.dsc: no acceptable signature found 106s autopkgtest [17:19:33]: testing package redict version 7.3.5+ds-1ubuntu0.1 107s autopkgtest [17:19:34]: build not needed 113s autopkgtest [17:19:40]: test 0001-redict-cli: preparing testbed 113s Reading package lists... 113s Building dependency tree... 113s Reading state information... 113s Solving dependencies... 113s The following NEW packages will be installed: 113s libhiredict1.3.1 liblzf1 redict redict-sentinel redict-server redict-tools 113s 0 upgraded, 6 newly installed, 0 to remove and 0 not upgraded. 113s Need to get 1319 kB of archives. 113s After this operation, 7261 kB of additional disk space will be used. 113s Get:1 http://ftpmaster.internal/ubuntu resolute/universe s390x libhiredict1.3.1 s390x 1.3.1-2 [41.0 kB] 114s Get:2 http://ftpmaster.internal/ubuntu resolute/universe s390x liblzf1 s390x 3.6-4 [7020 B] 114s Get:3 http://ftpmaster.internal/ubuntu resolute/universe s390x redict-tools s390x 7.3.5+ds-1ubuntu0.1 [1213 kB] 115s Get:4 http://ftpmaster.internal/ubuntu resolute/universe s390x redict-sentinel s390x 7.3.5+ds-1ubuntu0.1 [12.6 kB] 115s Get:5 http://ftpmaster.internal/ubuntu resolute/universe s390x redict-server s390x 7.3.5+ds-1ubuntu0.1 [41.3 kB] 115s Get:6 http://ftpmaster.internal/ubuntu resolute/universe s390x redict all 7.3.5+ds-1ubuntu0.1 [3716 B] 115s Fetched 1319 kB in 2s (878 kB/s) 115s Selecting previously unselected package libhiredict1.3.1:s390x. 115s (Reading database ... 57002 files and directories currently installed.) 115s Preparing to unpack .../0-libhiredict1.3.1_1.3.1-2_s390x.deb ... 115s Unpacking libhiredict1.3.1:s390x (1.3.1-2) ... 115s Selecting previously unselected package liblzf1:s390x. 115s Preparing to unpack .../1-liblzf1_3.6-4_s390x.deb ... 115s Unpacking liblzf1:s390x (3.6-4) ... 115s Selecting previously unselected package redict-tools. 115s Preparing to unpack .../2-redict-tools_7.3.5+ds-1ubuntu0.1_s390x.deb ... 115s Unpacking redict-tools (7.3.5+ds-1ubuntu0.1) ... 115s Selecting previously unselected package redict-sentinel.
115s Preparing to unpack .../3-redict-sentinel_7.3.5+ds-1ubuntu0.1_s390x.deb ... 115s Unpacking redict-sentinel (7.3.5+ds-1ubuntu0.1) ... 115s Selecting previously unselected package redict-server. 115s Preparing to unpack .../4-redict-server_7.3.5+ds-1ubuntu0.1_s390x.deb ... 115s Unpacking redict-server (7.3.5+ds-1ubuntu0.1) ... 115s Selecting previously unselected package redict. 115s Preparing to unpack .../5-redict_7.3.5+ds-1ubuntu0.1_all.deb ... 115s Unpacking redict (7.3.5+ds-1ubuntu0.1) ... 115s Setting up liblzf1:s390x (3.6-4) ... 115s Setting up libhiredict1.3.1:s390x (1.3.1-2) ... 115s Setting up redict-tools (7.3.5+ds-1ubuntu0.1) ... 115s Creating group 'redict' with GID 986. 115s Creating user 'redict' (Redict Key/Value Store) with UID 986 and GID 986. 115s Setting up redict-server (7.3.5+ds-1ubuntu0.1) ... 116s Created symlink '/etc/systemd/system/redict.service' → '/usr/lib/systemd/system/redict-server.service'. 116s Created symlink '/etc/systemd/system/multi-user.target.wants/redict-server.service' → '/usr/lib/systemd/system/redict-server.service'. 116s Setting up redict-sentinel (7.3.5+ds-1ubuntu0.1) ... 116s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redict-sentinel.service'. 116s Created symlink '/etc/systemd/system/multi-user.target.wants/redict-sentinel.service' → '/usr/lib/systemd/system/redict-sentinel.service'. 116s Setting up redict (7.3.5+ds-1ubuntu0.1) ... 116s Processing triggers for libc-bin (2.42-0ubuntu3) ... 118s autopkgtest [17:19:45]: test 0001-redict-cli: [----------------------- 124s # Server 124s redict_version:7.3.5 124s redict_git_sha1:00000000 124s redict_git_dirty:0 124s redict_build_id:5a4bf507d8959e52 124s redict_mode:standalone 124s redis_version:7.2.4 124s os:Linux 6.17.0-5-generic s390x 124s arch_bits:64 124s monotonic_clock:POSIX clock_gettime 124s multiplexing_api:epoll 124s atomicvar_api:c11-builtin 124s gcc_version:15.2.0 124s process_id:1799 124s process_supervised:systemd 124s run_id:f9b8eeed88647b15936cab9ba4a2c290fbcbd9a2 124s tcp_port:6379 124s server_time_usec:1762276790985032 124s uptime_in_seconds:5 124s uptime_in_days:0 124s hz:10 124s configured_hz:10 124s lru_clock:669110 124s executable:/usr/bin/redict-server 124s config_file:/etc/redict/redict.conf 124s io_threads_active:0 124s listener0:name=tcp,bind=127.0.0.1,bind=-::1,port=6379 124s 124s # Clients 124s connected_clients:3 124s cluster_connections:0 124s maxclients:10000 124s client_recent_max_input_buffer:20480 124s client_recent_max_output_buffer:0 124s blocked_clients:0 124s tracking_clients:0 124s pubsub_clients:1 124s watching_clients:0 124s clients_in_timeout_table:0 124s total_watched_keys:0 124s total_blocking_keys:0 124s total_blocking_keys_on_nokey:0 124s 124s # Memory 124s used_memory:1124176 124s used_memory_human:1.07M 124s used_memory_rss:14417920 124s used_memory_rss_human:13.75M 124s used_memory_peak:1124176 124s used_memory_peak_human:1.07M 124s used_memory_peak_perc:102.02% 124s used_memory_overhead:984128 124s used_memory_startup:939072 124s used_memory_dataset:140048 124s used_memory_dataset_perc:75.66% 124s allocator_allocated:4809120 124s allocator_active:9633792 124s allocator_resident:11993088 124s allocator_muzzy:0 124s total_system_memory:4189937664 124s total_system_memory_human:3.90G 124s used_memory_lua:31744 124s used_memory_vm_eval:31744 124s used_memory_lua_human:31.00K 124s used_memory_scripts_eval:0 124s number_of_cached_scripts:0 124s number_of_functions:0 124s number_of_libraries:0 124s 
used_memory_vm_functions:33792 124s used_memory_vm_total:65536 124s used_memory_vm_total_human:64.00K 124s used_memory_functions:200 124s used_memory_scripts:200 124s used_memory_scripts_human:200B 124s maxmemory:0 124s maxmemory_human:0B 124s maxmemory_policy:noeviction 124s allocator_frag_ratio:1.99 124s allocator_frag_bytes:4759136 124s allocator_rss_ratio:1.24 124s allocator_rss_bytes:2359296 124s rss_overhead_ratio:1.20 124s rss_overhead_bytes:2424832 124s mem_fragmentation_ratio:13.30 124s mem_fragmentation_bytes:13333720 124s mem_not_counted_for_evict:0 124s mem_replication_backlog:0 124s mem_total_replication_buffers:0 124s mem_clients_slaves:0 124s mem_clients_normal:44856 124s mem_cluster_links:0 124s mem_aof_buffer:0 124s mem_allocator:jemalloc-5.3.0 124s mem_overhead_db_hashtable_rehashing:0 124s active_defrag_running:0 124s lazyfree_pending_objects:0 124s lazyfreed_objects:0 124s 124s # Persistence 124s loading:0 124s async_loading:0 124s current_cow_peak:0 124s current_cow_size:0 124s current_cow_size_age:0 124s current_fork_perc:0.00 124s current_save_keys_processed:0 124s current_save_keys_total:0 124s rdb_changes_since_last_save:0 124s rdb_bgsave_in_progress:0 124s rdb_last_save_time:1762276785 124s rdb_last_bgsave_status:ok 124s rdb_last_bgsave_time_sec:-1 124s rdb_current_bgsave_time_sec:-1 124s rdb_saves:0 124s rdb_last_cow_size:0 124s rdb_last_load_keys_expired:0 124s rdb_last_load_keys_loaded:0 124s aof_enabled:0 124s aof_rewrite_in_progress:0 124s aof_rewrite_scheduled:0 124s aof_last_rewrite_time_sec:-1 124s aof_current_rewrite_time_sec:-1 124s aof_last_bgrewrite_status:ok 124s aof_rewrites:0 124s aof_rewrites_consecutive_failures:0 124s aof_last_write_status:ok 124s aof_last_cow_size:0 124s module_fork_in_progress:0 124s module_fork_last_cow_size:0 124s 124s # Stats 124s total_connections_received:3 124s total_commands_processed:9 124s instantaneous_ops_per_sec:1 124s total_net_input_bytes:497 124s total_net_output_bytes:360 124s total_net_repl_input_bytes:0 124s total_net_repl_output_bytes:0 124s instantaneous_input_kbps:0.09 124s instantaneous_output_kbps:0.09 124s instantaneous_input_repl_kbps:0.00 124s instantaneous_output_repl_kbps:0.00 124s rejected_connections:0 124s sync_full:0 124s sync_partial_ok:0 124s sync_partial_err:0 124s expired_keys:0 124s expired_stale_perc:0.00 124s expired_time_cap_reached_count:0 124s expire_cycle_cpu_milliseconds:0 124s evicted_keys:0 124s evicted_clients:0 124s evicted_scripts:0 124s total_eviction_exceeded_time:0 124s current_eviction_exceeded_time:0 124s keyspace_hits:0 124s keyspace_misses:0 124s pubsub_channels:1 124s pubsub_patterns:0 124s pubsubshard_channels:0 124s latest_fork_usec:0 124s total_forks:0 124s migrate_cached_sockets:0 124s slave_expires_tracked_keys:0 124s active_defrag_hits:0 124s active_defrag_misses:0 124s active_defrag_key_hits:0 124s active_defrag_key_misses:0 124s total_active_defrag_time:0 124s current_active_defrag_time:0 124s tracking_total_keys:0 124s tracking_total_items:0 124s tracking_total_prefixes:0 124s unexpected_error_replies:0 124s total_error_replies:0 124s dump_payload_sanitizations:0 124s total_reads_processed:8 124s total_writes_processed:9 124s io_threaded_reads_processed:0 124s io_threaded_writes_processed:0 124s client_query_buffer_limit_disconnections:0 124s client_output_buffer_limit_disconnections:0 124s reply_buffer_shrinks:2 124s reply_buffer_expands:0 124s eventloop_cycles:58 124s eventloop_duration_sum:9207 124s eventloop_duration_cmd_sum:42 124s 
instantaneous_eventloop_cycles_per_sec:10 124s instantaneous_eventloop_duration_usec:156 124s acl_access_denied_auth:0 124s acl_access_denied_cmd:0 124s acl_access_denied_key:0 124s acl_access_denied_channel:0 124s 124s # Replication 124s role:master 124s connected_slaves:0 124s master_failover_state:no-failover 124s master_replid:d0329615b2e001d2aea2443832f1508774246573 124s master_replid2:0000000000000000000000000000000000000000 124s master_repl_offset:0 124s second_repl_offset:-1 124s repl_backlog_active:0 124s repl_backlog_size:1048576 124s repl_backlog_first_byte_offset:0 124s repl_backlog_histlen:0 124s 124s # CPU 124s used_cpu_sys:0.013851 124s used_cpu_user:0.036266 124s used_cpu_sys_children:0.000305 124s used_cpu_user_children:0.000048 124s used_cpu_sys_main_thread:0.013742 124s used_cpu_user_main_thread:0.036218 124s 124s # Modules 124s 124s # Errorstats 124s 124s # Cluster 124s cluster_enabled:0 124s 124s # Keyspace 124s Redict ver. 7.3.5 124s autopkgtest [17:19:51]: test 0001-redict-cli: -----------------------] 124s autopkgtest [17:19:51]: test 0001-redict-cli: - - - - - - - - - - results - - - - - - - - - - 124s 0001-redict-cli PASS 125s autopkgtest [17:19:52]: test 0002-benchmark: preparing testbed 125s Reading package lists... 125s Building dependency tree... 125s Reading state information... 125s Solving dependencies... 125s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 127s autopkgtest [17:19:54]: test 0002-benchmark: [----------------------- 132s PING_INLINE: rps=0.0 (overall: 500000.0) avg_msec=0.855 (overall: 0.855) ====== PING_INLINE ====== 132s 100000 requests completed in 0.11 seconds 132s 50 parallel clients 132s 3 bytes payload 132s keep alive: 1 132s host configuration "save": 3600 1 300 100 60 10000 132s host configuration "appendonly": no 132s multi-thread: no 132s 132s Latency by percentile distribution: 132s 0.000% <= 0.103 milliseconds (cumulative count 120) 132s 50.000% <= 0.407 milliseconds (cumulative count 50570) 132s 75.000% <= 0.535 milliseconds (cumulative count 75400) 132s 87.500% <= 0.703 milliseconds (cumulative count 88150) 132s 93.750% <= 0.767 milliseconds (cumulative count 93890) 132s 96.875% <= 0.807 milliseconds (cumulative count 97200) 132s 98.438% <= 0.831 milliseconds (cumulative count 98800) 132s 99.219% <= 0.847 milliseconds (cumulative count 99340) 132s 99.609% <= 0.863 milliseconds (cumulative count 99660) 132s 99.805% <= 0.879 milliseconds (cumulative count 99810) 132s 99.902% <= 1.927 milliseconds (cumulative count 99920) 132s 99.951% <= 1.943 milliseconds (cumulative count 99960) 132s 99.976% <= 1.951 milliseconds (cumulative count 99980) 132s 99.988% <= 1.959 milliseconds (cumulative count 100000) 132s 100.000% <= 1.959 milliseconds (cumulative count 100000) 132s 132s Cumulative distribution of latencies: 132s 0.120% <= 0.103 milliseconds (cumulative count 120) 132s 2.410% <= 0.207 milliseconds (cumulative count 2410) 132s 17.350% <= 0.303 milliseconds (cumulative count 17350) 132s 50.570% <= 0.407 milliseconds (cumulative count 50570) 132s 72.750% <= 0.503 milliseconds (cumulative count 72750) 132s 79.130% <= 0.607 milliseconds (cumulative count 79130) 132s 88.150% <= 0.703 milliseconds (cumulative count 88150) 132s 97.200% <= 0.807 milliseconds (cumulative count 97200) 132s 99.880% <= 0.903 milliseconds (cumulative count 99880) 132s 100.000% <= 2.007 milliseconds (cumulative count 100000) 132s 132s Summary: 132s throughput summary: 943396.25 requests per second 132s latency summary (msec): 132s avg min p50 p95 
p99 max 132s 0.448 0.096 0.407 0.783 0.839 1.959 132s ====== PING_MBULK ====== 132s 100000 requests completed in 0.10 seconds 132s 50 parallel clients 132s 3 bytes payload 132s keep alive: 1 132s host configuration "save": 3600 1 300 100 60 10000 132s host configuration "appendonly": no 132s multi-thread: no 132s 132s Latency by percentile distribution: 132s 0.000% <= 0.103 milliseconds (cumulative count 850) 132s 50.000% <= 0.399 milliseconds (cumulative count 51200) 132s 75.000% <= 0.471 milliseconds (cumulative count 76490) 132s 87.500% <= 0.503 milliseconds (cumulative count 87880) 132s 93.750% <= 0.543 milliseconds (cumulative count 93850) 132s 96.875% <= 0.591 milliseconds (cumulative count 96990) 132s 98.438% <= 0.671 milliseconds (cumulative count 98530) 132s 99.219% <= 0.735 milliseconds (cumulative count 99230) 132s 99.609% <= 5.647 milliseconds (cumulative count 99620) 132s 99.805% <= 5.719 milliseconds (cumulative count 99820) 132s 99.902% <= 5.751 milliseconds (cumulative count 99910) 132s 99.951% <= 5.775 milliseconds (cumulative count 99970) 132s 99.976% <= 5.783 milliseconds (cumulative count 99990) 132s 99.994% <= 5.791 milliseconds (cumulative count 100000) 132s 100.000% <= 5.791 milliseconds (cumulative count 100000) 132s 132s Cumulative distribution of latencies: 132s 0.850% <= 0.103 milliseconds (cumulative count 850) 132s 14.340% <= 0.207 milliseconds (cumulative count 14340) 132s 37.450% <= 0.303 milliseconds (cumulative count 37450) 132s 53.730% <= 0.407 milliseconds (cumulative count 53730) 132s 87.880% <= 0.503 milliseconds (cumulative count 87880) 132s 97.350% <= 0.607 milliseconds (cumulative count 97350) 132s 98.970% <= 0.703 milliseconds (cumulative count 98970) 132s 99.480% <= 0.807 milliseconds (cumulative count 99480) 132s 99.510% <= 0.903 milliseconds (cumulative count 99510) 132s 100.000% <= 6.103 milliseconds (cumulative count 100000) 132s 132s Summary: 132s throughput summary: 1020408.19 requests per second 132s latency summary (msec): 132s avg min p50 p95 p99 max 132s 0.390 0.096 0.399 0.559 0.711 5.791 133s SET: rps=143480.0 (overall: 834186.0) avg_msec=0.472 (overall: 0.472) ====== SET ====== 133s 100000 requests completed in 0.13 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.103 milliseconds (cumulative count 200) 133s 50.000% <= 0.479 milliseconds (cumulative count 50430) 133s 75.000% <= 0.607 milliseconds (cumulative count 75540) 133s 87.500% <= 0.791 milliseconds (cumulative count 87700) 133s 93.750% <= 0.855 milliseconds (cumulative count 94360) 133s 96.875% <= 0.911 milliseconds (cumulative count 96930) 133s 98.438% <= 0.983 milliseconds (cumulative count 98450) 133s 99.219% <= 1.047 milliseconds (cumulative count 99260) 133s 99.609% <= 1.111 milliseconds (cumulative count 99640) 133s 99.805% <= 1.143 milliseconds (cumulative count 99830) 133s 99.902% <= 1.175 milliseconds (cumulative count 99910) 133s 99.951% <= 6.815 milliseconds (cumulative count 99970) 133s 99.976% <= 6.823 milliseconds (cumulative count 99980) 133s 99.988% <= 6.831 milliseconds (cumulative count 100000) 133s 100.000% <= 6.831 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.200% <= 0.103 milliseconds (cumulative count 200) 133s 3.600% <= 0.207 milliseconds (cumulative count 3600) 133s 10.350% <= 0.303 milliseconds 
(cumulative count 10350) 133s 30.230% <= 0.407 milliseconds (cumulative count 30230) 133s 56.720% <= 0.503 milliseconds (cumulative count 56720) 133s 75.540% <= 0.607 milliseconds (cumulative count 75540) 133s 82.980% <= 0.703 milliseconds (cumulative count 82980) 133s 89.180% <= 0.807 milliseconds (cumulative count 89180) 133s 96.730% <= 0.903 milliseconds (cumulative count 96730) 133s 98.890% <= 1.007 milliseconds (cumulative count 98890) 133s 99.590% <= 1.103 milliseconds (cumulative count 99590) 133s 99.910% <= 1.207 milliseconds (cumulative count 99910) 133s 100.000% <= 7.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 781249.94 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.516 0.096 0.479 0.863 1.023 6.831 133s ====== GET ====== 133s 100000 requests completed in 0.12 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.103 milliseconds (cumulative count 100) 133s 50.000% <= 0.423 milliseconds (cumulative count 51660) 133s 75.000% <= 0.543 milliseconds (cumulative count 75300) 133s 87.500% <= 0.727 milliseconds (cumulative count 88150) 133s 93.750% <= 0.791 milliseconds (cumulative count 94460) 133s 96.875% <= 0.831 milliseconds (cumulative count 97290) 133s 98.438% <= 0.871 milliseconds (cumulative count 98490) 133s 99.219% <= 0.983 milliseconds (cumulative count 99250) 133s 99.609% <= 5.551 milliseconds (cumulative count 99620) 133s 99.805% <= 5.631 milliseconds (cumulative count 99810) 133s 99.902% <= 5.679 milliseconds (cumulative count 99920) 133s 99.951% <= 5.695 milliseconds (cumulative count 99960) 133s 99.976% <= 5.711 milliseconds (cumulative count 99990) 133s 99.994% <= 5.719 milliseconds (cumulative count 100000) 133s 100.000% <= 5.719 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.100% <= 0.103 milliseconds (cumulative count 100) 133s 3.300% <= 0.207 milliseconds (cumulative count 3300) 133s 17.080% <= 0.303 milliseconds (cumulative count 17080) 133s 47.160% <= 0.407 milliseconds (cumulative count 47160) 133s 70.790% <= 0.503 milliseconds (cumulative count 70790) 133s 78.890% <= 0.607 milliseconds (cumulative count 78890) 133s 86.130% <= 0.703 milliseconds (cumulative count 86130) 133s 95.780% <= 0.807 milliseconds (cumulative count 95780) 133s 98.920% <= 0.903 milliseconds (cumulative count 98920) 133s 99.370% <= 1.007 milliseconds (cumulative count 99370) 133s 99.510% <= 1.103 milliseconds (cumulative count 99510) 133s 100.000% <= 6.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 869565.19 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.480 0.096 0.423 0.799 0.911 5.719 133s INCR: rps=309360.0 (overall: 1645532.0) avg_msec=0.260 (overall: 0.260) ====== INCR ====== 133s 100000 requests completed in 0.06 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.095 milliseconds (cumulative count 10) 133s 50.000% <= 0.255 milliseconds (cumulative count 52480) 133s 75.000% <= 0.295 milliseconds (cumulative count 75520) 133s 87.500% <= 0.335 milliseconds (cumulative count 89540) 133s 
93.750% <= 0.351 milliseconds (cumulative count 93990) 133s 96.875% <= 0.375 milliseconds (cumulative count 96900) 133s 98.438% <= 0.423 milliseconds (cumulative count 98600) 133s 99.219% <= 0.471 milliseconds (cumulative count 99310) 133s 99.609% <= 0.503 milliseconds (cumulative count 99650) 133s 99.805% <= 0.535 milliseconds (cumulative count 99810) 133s 99.902% <= 0.583 milliseconds (cumulative count 99910) 133s 99.951% <= 0.647 milliseconds (cumulative count 99970) 133s 99.976% <= 0.655 milliseconds (cumulative count 99980) 133s 99.988% <= 0.663 milliseconds (cumulative count 99990) 133s 99.994% <= 0.679 milliseconds (cumulative count 100000) 133s 100.000% <= 0.679 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.040% <= 0.103 milliseconds (cumulative count 40) 133s 16.570% <= 0.207 milliseconds (cumulative count 16570) 133s 78.730% <= 0.303 milliseconds (cumulative count 78730) 133s 98.240% <= 0.407 milliseconds (cumulative count 98240) 133s 99.650% <= 0.503 milliseconds (cumulative count 99650) 133s 99.930% <= 0.607 milliseconds (cumulative count 99930) 133s 100.000% <= 0.703 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1639344.25 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.261 0.088 0.255 0.359 0.447 0.679 133s ====== LPUSH ====== 133s 100000 requests completed in 0.08 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.087 milliseconds (cumulative count 10) 133s 50.000% <= 0.343 milliseconds (cumulative count 50210) 133s 75.000% <= 0.391 milliseconds (cumulative count 75740) 133s 87.500% <= 0.423 milliseconds (cumulative count 87540) 133s 93.750% <= 0.455 milliseconds (cumulative count 94860) 133s 96.875% <= 0.479 milliseconds (cumulative count 97490) 133s 98.438% <= 0.503 milliseconds (cumulative count 98570) 133s 99.219% <= 0.543 milliseconds (cumulative count 99300) 133s 99.609% <= 0.791 milliseconds (cumulative count 99620) 133s 99.805% <= 0.895 milliseconds (cumulative count 99820) 133s 99.902% <= 0.927 milliseconds (cumulative count 99910) 133s 99.951% <= 0.959 milliseconds (cumulative count 99960) 133s 99.976% <= 0.983 milliseconds (cumulative count 99980) 133s 99.988% <= 0.991 milliseconds (cumulative count 99990) 133s 99.994% <= 1.007 milliseconds (cumulative count 100000) 133s 100.000% <= 1.007 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.040% <= 0.103 milliseconds (cumulative count 40) 133s 1.700% <= 0.207 milliseconds (cumulative count 1700) 133s 26.680% <= 0.303 milliseconds (cumulative count 26680) 133s 81.950% <= 0.407 milliseconds (cumulative count 81950) 133s 98.570% <= 0.503 milliseconds (cumulative count 98570) 133s 99.430% <= 0.607 milliseconds (cumulative count 99430) 133s 99.660% <= 0.807 milliseconds (cumulative count 99660) 133s 99.850% <= 0.903 milliseconds (cumulative count 99850) 133s 100.000% <= 1.007 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1265822.75 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.348 0.080 0.343 0.463 0.527 1.007 133s ====== RPUSH ====== 133s 100000 requests completed in 0.07 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 
300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.095 milliseconds (cumulative count 20) 133s 50.000% <= 0.303 milliseconds (cumulative count 53040) 133s 75.000% <= 0.351 milliseconds (cumulative count 76690) 133s 87.500% <= 0.383 milliseconds (cumulative count 87760) 133s 93.750% <= 0.415 milliseconds (cumulative count 93810) 133s 96.875% <= 0.455 milliseconds (cumulative count 97380) 133s 98.438% <= 0.487 milliseconds (cumulative count 98650) 133s 99.219% <= 0.535 milliseconds (cumulative count 99250) 133s 99.609% <= 0.591 milliseconds (cumulative count 99620) 133s 99.805% <= 0.639 milliseconds (cumulative count 99820) 133s 99.902% <= 0.671 milliseconds (cumulative count 99920) 133s 99.951% <= 0.695 milliseconds (cumulative count 99970) 133s 99.976% <= 0.711 milliseconds (cumulative count 99980) 133s 99.988% <= 0.719 milliseconds (cumulative count 99990) 133s 99.994% <= 0.743 milliseconds (cumulative count 100000) 133s 100.000% <= 0.743 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.040% <= 0.103 milliseconds (cumulative count 40) 133s 5.500% <= 0.207 milliseconds (cumulative count 5500) 133s 53.040% <= 0.303 milliseconds (cumulative count 53040) 133s 92.820% <= 0.407 milliseconds (cumulative count 92820) 133s 98.880% <= 0.503 milliseconds (cumulative count 98880) 133s 99.700% <= 0.607 milliseconds (cumulative count 99700) 133s 99.970% <= 0.703 milliseconds (cumulative count 99970) 133s 100.000% <= 0.807 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1388889.00 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.306 0.088 0.303 0.431 0.519 0.743 133s ====== LPOP ====== 133s 100000 requests completed in 0.08 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.095 milliseconds (cumulative count 30) 133s 50.000% <= 0.359 milliseconds (cumulative count 50340) 133s 75.000% <= 0.407 milliseconds (cumulative count 77020) 133s 87.500% <= 0.439 milliseconds (cumulative count 88860) 133s 93.750% <= 0.463 milliseconds (cumulative count 94520) 133s 96.875% <= 0.487 milliseconds (cumulative count 97450) 133s 98.438% <= 0.511 milliseconds (cumulative count 98450) 133s 99.219% <= 0.623 milliseconds (cumulative count 99220) 133s 99.609% <= 0.695 milliseconds (cumulative count 99610) 133s 99.805% <= 0.775 milliseconds (cumulative count 99830) 133s 99.902% <= 0.799 milliseconds (cumulative count 99910) 133s 99.951% <= 0.831 milliseconds (cumulative count 99960) 133s 99.976% <= 0.847 milliseconds (cumulative count 99980) 133s 99.988% <= 0.855 milliseconds (cumulative count 99990) 133s 99.994% <= 0.863 milliseconds (cumulative count 100000) 133s 100.000% <= 0.863 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.070% <= 0.103 milliseconds (cumulative count 70) 133s 0.720% <= 0.207 milliseconds (cumulative count 720) 133s 18.930% <= 0.303 milliseconds (cumulative count 18930) 133s 77.020% <= 0.407 milliseconds (cumulative count 77020) 133s 98.310% <= 0.503 milliseconds (cumulative count 98310) 133s 99.120% <= 0.607 milliseconds (cumulative count 99120) 133s 99.630% <= 0.703 milliseconds (cumulative count 99630) 133s 99.940% <= 0.807 milliseconds (cumulative 
count 99940) 133s 100.000% <= 0.903 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1219512.12 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.362 0.088 0.359 0.471 0.591 0.863 133s RPOP: rps=3680.0 (overall: inf) avg_msec=0.309 (overall: 0.309) ====== RPOP ====== 133s 100000 requests completed in 0.08 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.103 milliseconds (cumulative count 10) 133s 50.000% <= 0.367 milliseconds (cumulative count 53000) 133s 75.000% <= 0.423 milliseconds (cumulative count 76070) 133s 87.500% <= 0.471 milliseconds (cumulative count 88090) 133s 93.750% <= 0.511 milliseconds (cumulative count 94690) 133s 96.875% <= 0.535 milliseconds (cumulative count 97200) 133s 98.438% <= 0.567 milliseconds (cumulative count 98690) 133s 99.219% <= 0.599 milliseconds (cumulative count 99300) 133s 99.609% <= 0.647 milliseconds (cumulative count 99630) 133s 99.805% <= 0.703 milliseconds (cumulative count 99810) 133s 99.902% <= 0.759 milliseconds (cumulative count 99910) 133s 99.951% <= 0.799 milliseconds (cumulative count 99960) 133s 99.976% <= 0.823 milliseconds (cumulative count 99980) 133s 99.988% <= 0.831 milliseconds (cumulative count 99990) 133s 99.994% <= 0.863 milliseconds (cumulative count 100000) 133s 100.000% <= 0.863 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.010% <= 0.103 milliseconds (cumulative count 10) 133s 1.140% <= 0.207 milliseconds (cumulative count 1140) 133s 20.200% <= 0.303 milliseconds (cumulative count 20200) 133s 69.950% <= 0.407 milliseconds (cumulative count 69950) 133s 93.500% <= 0.503 milliseconds (cumulative count 93500) 133s 99.370% <= 0.607 milliseconds (cumulative count 99370) 133s 99.810% <= 0.703 milliseconds (cumulative count 99810) 133s 99.970% <= 0.807 milliseconds (cumulative count 99970) 133s 100.000% <= 0.903 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1190476.25 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.371 0.096 0.367 0.519 0.583 0.863 133s ====== SADD ====== 133s 100000 requests completed in 0.08 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.119 milliseconds (cumulative count 20) 133s 50.000% <= 0.359 milliseconds (cumulative count 51820) 133s 75.000% <= 0.415 milliseconds (cumulative count 75410) 133s 87.500% <= 0.463 milliseconds (cumulative count 89140) 133s 93.750% <= 0.487 milliseconds (cumulative count 94610) 133s 96.875% <= 0.511 milliseconds (cumulative count 97090) 133s 98.438% <= 0.543 milliseconds (cumulative count 98610) 133s 99.219% <= 0.583 milliseconds (cumulative count 99220) 133s 99.609% <= 0.639 milliseconds (cumulative count 99610) 133s 99.805% <= 0.671 milliseconds (cumulative count 99810) 133s 99.902% <= 0.711 milliseconds (cumulative count 99910) 133s 99.951% <= 0.743 milliseconds (cumulative count 99960) 133s 99.976% <= 0.751 milliseconds (cumulative count 99980) 133s 99.988% <= 0.759 milliseconds (cumulative count 99990) 133s 99.994% <= 0.783 milliseconds (cumulative count 100000) 133s 100.000% <= 0.783 
milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 1.950% <= 0.207 milliseconds (cumulative count 1950) 133s 20.310% <= 0.303 milliseconds (cumulative count 20310) 133s 72.800% <= 0.407 milliseconds (cumulative count 72800) 133s 96.540% <= 0.503 milliseconds (cumulative count 96540) 133s 99.410% <= 0.607 milliseconds (cumulative count 99410) 133s 99.900% <= 0.703 milliseconds (cumulative count 99900) 133s 100.000% <= 0.807 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1190476.25 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.365 0.112 0.359 0.495 0.567 0.783 133s HSET: rps=346255.0 (overall: 1072963.0) avg_msec=0.410 (overall: 0.410) ====== HSET ====== 133s 100000 requests completed in 0.09 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.135 milliseconds (cumulative count 10) 133s 50.000% <= 0.407 milliseconds (cumulative count 51940) 133s 75.000% <= 0.471 milliseconds (cumulative count 77230) 133s 87.500% <= 0.511 milliseconds (cumulative count 88250) 133s 93.750% <= 0.543 milliseconds (cumulative count 94610) 133s 96.875% <= 0.575 milliseconds (cumulative count 97270) 133s 98.438% <= 0.615 milliseconds (cumulative count 98440) 133s 99.219% <= 0.727 milliseconds (cumulative count 99220) 133s 99.609% <= 0.903 milliseconds (cumulative count 99630) 133s 99.805% <= 0.959 milliseconds (cumulative count 99830) 133s 99.902% <= 1.007 milliseconds (cumulative count 99910) 133s 99.951% <= 1.055 milliseconds (cumulative count 99960) 133s 99.976% <= 1.087 milliseconds (cumulative count 99980) 133s 99.988% <= 1.095 milliseconds (cumulative count 99990) 133s 99.994% <= 1.103 milliseconds (cumulative count 100000) 133s 100.000% <= 1.103 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 1.320% <= 0.207 milliseconds (cumulative count 1320) 133s 9.690% <= 0.303 milliseconds (cumulative count 9690) 133s 51.940% <= 0.407 milliseconds (cumulative count 51940) 133s 86.340% <= 0.503 milliseconds (cumulative count 86340) 133s 98.340% <= 0.607 milliseconds (cumulative count 98340) 133s 99.130% <= 0.703 milliseconds (cumulative count 99130) 133s 99.350% <= 0.807 milliseconds (cumulative count 99350) 133s 99.630% <= 0.903 milliseconds (cumulative count 99630) 133s 99.910% <= 1.007 milliseconds (cumulative count 99910) 133s 100.000% <= 1.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1075268.75 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.410 0.128 0.407 0.551 0.679 1.103 133s ====== SPOP ====== 133s 100000 requests completed in 0.07 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.095 milliseconds (cumulative count 30) 133s 50.000% <= 0.279 milliseconds (cumulative count 52490) 133s 75.000% <= 0.335 milliseconds (cumulative count 76850) 133s 87.500% <= 0.375 milliseconds (cumulative count 87580) 133s 93.750% <= 0.415 milliseconds (cumulative count 94420) 133s 96.875% 
<= 0.447 milliseconds (cumulative count 97210) 133s 98.438% <= 0.479 milliseconds (cumulative count 98500) 133s 99.219% <= 0.511 milliseconds (cumulative count 99270) 133s 99.609% <= 0.551 milliseconds (cumulative count 99650) 133s 99.805% <= 0.591 milliseconds (cumulative count 99860) 133s 99.902% <= 0.607 milliseconds (cumulative count 99910) 133s 99.951% <= 0.655 milliseconds (cumulative count 99960) 133s 99.976% <= 0.679 milliseconds (cumulative count 99980) 133s 99.988% <= 0.711 milliseconds (cumulative count 99990) 133s 99.994% <= 0.767 milliseconds (cumulative count 100000) 133s 100.000% <= 0.767 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.080% <= 0.103 milliseconds (cumulative count 80) 133s 15.580% <= 0.207 milliseconds (cumulative count 15580) 133s 64.310% <= 0.303 milliseconds (cumulative count 64310) 133s 93.480% <= 0.407 milliseconds (cumulative count 93480) 133s 99.130% <= 0.503 milliseconds (cumulative count 99130) 133s 99.910% <= 0.607 milliseconds (cumulative count 99910) 133s 99.980% <= 0.703 milliseconds (cumulative count 99980) 133s 100.000% <= 0.807 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1470588.12 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.284 0.088 0.279 0.423 0.503 0.767 133s ====== ZADD ====== 133s 100000 requests completed in 0.10 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.111 milliseconds (cumulative count 30) 133s 50.000% <= 0.407 milliseconds (cumulative count 51180) 133s 75.000% <= 0.487 milliseconds (cumulative count 76350) 133s 87.500% <= 0.551 milliseconds (cumulative count 88150) 133s 93.750% <= 0.647 milliseconds (cumulative count 94070) 133s 96.875% <= 0.783 milliseconds (cumulative count 96990) 133s 98.438% <= 0.951 milliseconds (cumulative count 98450) 133s 99.219% <= 1.191 milliseconds (cumulative count 99220) 133s 99.609% <= 1.959 milliseconds (cumulative count 99640) 133s 99.805% <= 1.991 milliseconds (cumulative count 99810) 133s 99.902% <= 2.063 milliseconds (cumulative count 99910) 133s 99.951% <= 2.111 milliseconds (cumulative count 99960) 133s 99.976% <= 2.407 milliseconds (cumulative count 99980) 133s 99.988% <= 2.543 milliseconds (cumulative count 99990) 133s 99.994% <= 2.551 milliseconds (cumulative count 100000) 133s 100.000% <= 2.551 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 1.320% <= 0.207 milliseconds (cumulative count 1320) 133s 12.490% <= 0.303 milliseconds (cumulative count 12490) 133s 51.180% <= 0.407 milliseconds (cumulative count 51180) 133s 79.780% <= 0.503 milliseconds (cumulative count 79780) 133s 92.530% <= 0.607 milliseconds (cumulative count 92530) 133s 95.680% <= 0.703 milliseconds (cumulative count 95680) 133s 97.240% <= 0.807 milliseconds (cumulative count 97240) 133s 98.070% <= 0.903 milliseconds (cumulative count 98070) 133s 98.800% <= 1.007 milliseconds (cumulative count 98800) 133s 99.010% <= 1.103 milliseconds (cumulative count 99010) 133s 99.260% <= 1.207 milliseconds (cumulative count 99260) 133s 99.450% <= 1.303 milliseconds (cumulative count 99450) 133s 99.500% <= 1.407 milliseconds (cumulative count 99500) 133s 99.520% <= 1.607 milliseconds (cumulative count 99520) 
133s 99.530% <= 1.703 milliseconds (cumulative count 99530) 133s 99.540% <= 1.807 milliseconds (cumulative count 99540) 133s 99.840% <= 2.007 milliseconds (cumulative count 99840) 133s 99.950% <= 2.103 milliseconds (cumulative count 99950) 133s 100.000% <= 3.103 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1020408.19 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.434 0.104 0.407 0.679 1.103 2.551 133s ZPOPMIN: rps=391480.0 (overall: 1418405.9) avg_msec=0.301 (overall: 0.301) ====== ZPOPMIN ====== 133s 100000 requests completed in 0.07 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.095 milliseconds (cumulative count 20) 133s 50.000% <= 0.295 milliseconds (cumulative count 53530) 133s 75.000% <= 0.351 milliseconds (cumulative count 76280) 133s 87.500% <= 0.407 milliseconds (cumulative count 87970) 133s 93.750% <= 0.455 milliseconds (cumulative count 94390) 133s 96.875% <= 0.503 milliseconds (cumulative count 97170) 133s 98.438% <= 0.559 milliseconds (cumulative count 98520) 133s 99.219% <= 0.631 milliseconds (cumulative count 99250) 133s 99.609% <= 0.687 milliseconds (cumulative count 99620) 133s 99.805% <= 0.719 milliseconds (cumulative count 99830) 133s 99.902% <= 0.759 milliseconds (cumulative count 99910) 133s 99.951% <= 0.799 milliseconds (cumulative count 99960) 133s 99.976% <= 0.831 milliseconds (cumulative count 99980) 133s 99.988% <= 0.847 milliseconds (cumulative count 99990) 133s 99.994% <= 0.927 milliseconds (cumulative count 100000) 133s 100.000% <= 0.927 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.090% <= 0.103 milliseconds (cumulative count 90) 133s 13.910% <= 0.207 milliseconds (cumulative count 13910) 133s 57.660% <= 0.303 milliseconds (cumulative count 57660) 133s 87.970% <= 0.407 milliseconds (cumulative count 87970) 133s 97.170% <= 0.503 milliseconds (cumulative count 97170) 133s 99.010% <= 0.607 milliseconds (cumulative count 99010) 133s 99.760% <= 0.703 milliseconds (cumulative count 99760) 133s 99.970% <= 0.807 milliseconds (cumulative count 99970) 133s 99.990% <= 0.903 milliseconds (cumulative count 99990) 133s 100.000% <= 1.007 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1408450.62 requests per second 133s latency summary (msec): 133s avg min p50 p95 p99 max 133s 0.300 0.088 0.295 0.471 0.607 0.927 133s ====== LPUSH (needed to benchmark LRANGE) ====== 133s 100000 requests completed in 0.09 seconds 133s 50 parallel clients 133s 3 bytes payload 133s keep alive: 1 133s host configuration "save": 3600 1 300 100 60 10000 133s host configuration "appendonly": no 133s multi-thread: no 133s 133s Latency by percentile distribution: 133s 0.000% <= 0.119 milliseconds (cumulative count 10) 133s 50.000% <= 0.407 milliseconds (cumulative count 52920) 133s 75.000% <= 0.471 milliseconds (cumulative count 76670) 133s 87.500% <= 0.519 milliseconds (cumulative count 88520) 133s 93.750% <= 0.559 milliseconds (cumulative count 94450) 133s 96.875% <= 0.599 milliseconds (cumulative count 97230) 133s 98.438% <= 0.663 milliseconds (cumulative count 98490) 133s 99.219% <= 0.751 milliseconds (cumulative count 99240) 133s 99.609% <= 0.903 milliseconds (cumulative count 99640) 133s 99.805% <= 1.055 milliseconds (cumulative count 
99810) 133s 99.902% <= 1.167 milliseconds (cumulative count 99910) 133s 99.951% <= 1.231 milliseconds (cumulative count 99960) 133s 99.976% <= 1.239 milliseconds (cumulative count 99980) 133s 99.988% <= 1.255 milliseconds (cumulative count 99990) 133s 99.994% <= 1.295 milliseconds (cumulative count 100000) 133s 100.000% <= 1.295 milliseconds (cumulative count 100000) 133s 133s Cumulative distribution of latencies: 133s 0.000% <= 0.103 milliseconds (cumulative count 0) 133s 1.180% <= 0.207 milliseconds (cumulative count 1180) 133s 11.390% <= 0.303 milliseconds (cumulative count 11390) 133s 52.920% <= 0.407 milliseconds (cumulative count 52920) 133s 85.090% <= 0.503 milliseconds (cumulative count 85090) 133s 97.500% <= 0.607 milliseconds (cumulative count 97500) 133s 98.910% <= 0.703 milliseconds (cumulative count 98910) 133s 99.370% <= 0.807 milliseconds (cumulative count 99370) 133s 99.640% <= 0.903 milliseconds (cumulative count 99640) 133s 99.730% <= 1.007 milliseconds (cumulative count 99730) 133s 99.850% <= 1.103 milliseconds (cumulative count 99850) 133s 99.940% <= 1.207 milliseconds (cumulative count 99940) 133s 100.000% <= 1.303 milliseconds (cumulative count 100000) 133s 133s Summary: 133s throughput summary: 1063829.88 requests per second 133s latency summary (msec): 134s avg min p50 p95 p99 max 134s 0.411 0.112 0.407 0.567 0.719 1.295 135s LRANGE_100 (first 100 elements): rps=86533.9 (overall: 141039.0) avg_msec=2.533 (overall: 2.533) LRANGE_100 (first 100 elements): rps=129881.0 (overall: 134113.3) avg_msec=2.885 (overall: 2.745) LRANGE_100 (first 100 elements): rps=88725.1 (overall: 116773.2) avg_msec=4.499 (overall: 3.254) LRANGE_100 (first 100 elements): rps=75160.0 (overall: 105303.2) avg_msec=4.603 (overall: 3.519) ====== LRANGE_100 (first 100 elements) ====== 135s 100000 requests completed in 0.94 seconds 135s 50 parallel clients 135s 3 bytes payload 135s keep alive: 1 135s host configuration "save": 3600 1 300 100 60 10000 135s host configuration "appendonly": no 135s multi-thread: no 135s 135s Latency by percentile distribution: 135s 0.000% <= 0.423 milliseconds (cumulative count 10) 135s 50.000% <= 3.055 milliseconds (cumulative count 50050) 135s 75.000% <= 4.303 milliseconds (cumulative count 75000) 135s 87.500% <= 5.431 milliseconds (cumulative count 87580) 135s 93.750% <= 6.615 milliseconds (cumulative count 93770) 135s 96.875% <= 7.831 milliseconds (cumulative count 96880) 135s 98.438% <= 9.175 milliseconds (cumulative count 98440) 135s 99.219% <= 10.527 milliseconds (cumulative count 99220) 135s 99.609% <= 12.255 milliseconds (cumulative count 99610) 135s 99.805% <= 13.799 milliseconds (cumulative count 99810) 135s 99.902% <= 14.943 milliseconds (cumulative count 99910) 135s 99.951% <= 15.383 milliseconds (cumulative count 99960) 135s 99.976% <= 15.543 milliseconds (cumulative count 99980) 135s 99.988% <= 15.607 milliseconds (cumulative count 99990) 135s 99.994% <= 15.687 milliseconds (cumulative count 100000) 135s 100.000% <= 15.687 milliseconds (cumulative count 100000) 135s 135s Cumulative distribution of latencies: 135s 0.000% <= 0.103 milliseconds (cumulative count 0) 135s 0.040% <= 0.503 milliseconds (cumulative count 40) 135s 0.100% <= 0.607 milliseconds (cumulative count 100) 135s 0.170% <= 0.703 milliseconds (cumulative count 170) 135s 0.230% <= 0.807 milliseconds (cumulative count 230) 135s 0.260% <= 0.903 milliseconds (cumulative count 260) 135s 0.320% <= 1.007 milliseconds (cumulative count 320) 135s 0.460% <= 1.103 milliseconds (cumulative count 460) 
135s 0.830% <= 1.207 milliseconds (cumulative count 830) 135s 1.480% <= 1.303 milliseconds (cumulative count 1480) 135s 2.660% <= 1.407 milliseconds (cumulative count 2660) 135s 3.890% <= 1.503 milliseconds (cumulative count 3890) 135s 5.900% <= 1.607 milliseconds (cumulative count 5900) 135s 8.210% <= 1.703 milliseconds (cumulative count 8210) 135s 10.970% <= 1.807 milliseconds (cumulative count 10970) 135s 13.770% <= 1.903 milliseconds (cumulative count 13770) 135s 17.040% <= 2.007 milliseconds (cumulative count 17040) 135s 20.120% <= 2.103 milliseconds (cumulative count 20120) 135s 51.260% <= 3.103 milliseconds (cumulative count 51260) 135s 71.770% <= 4.103 milliseconds (cumulative count 71770) 135s 84.600% <= 5.103 milliseconds (cumulative count 84600) 135s 91.700% <= 6.103 milliseconds (cumulative count 91700) 135s 95.230% <= 7.103 milliseconds (cumulative count 95230) 135s 97.280% <= 8.103 milliseconds (cumulative count 97280) 135s 98.360% <= 9.103 milliseconds (cumulative count 98360) 135s 98.990% <= 10.103 milliseconds (cumulative count 98990) 135s 99.450% <= 11.103 milliseconds (cumulative count 99450) 135s 99.590% <= 12.103 milliseconds (cumulative count 99590) 135s 99.740% <= 13.103 milliseconds (cumulative count 99740) 135s 99.820% <= 14.103 milliseconds (cumulative count 99820) 135s 99.920% <= 15.103 milliseconds (cumulative count 99920) 135s 100.000% <= 16.103 milliseconds (cumulative count 100000) 135s 135s Summary: 135s throughput summary: 105820.11 requests per second 135s latency summary (msec): 135s avg min p50 p95 p99 max 135s 3.522 0.416 3.055 7.015 10.183 15.687 138s LRANGE_300 (first 300 elements): rps=27290.8 (overall: 32311.3) avg_msec=9.733 (overall: 9.733) LRANGE_300 (first 300 elements): rps=31119.0 (overall: 31663.8) avg_msec=9.727 (overall: 9.730) LRANGE_300 (first 300 elements): rps=35450.2 (overall: 32993.0) avg_msec=8.252 (overall: 9.172) LRANGE_300 (first 300 elements): rps=27274.9 (overall: 31507.2) avg_msec=11.106 (overall: 9.607) LRANGE_300 (first 300 elements): rps=29698.4 (overall: 31133.0) avg_msec=9.823 (overall: 9.650) LRANGE_300 (first 300 elements): rps=30672.0 (overall: 31054.5) avg_msec=10.059 (overall: 9.719) LRANGE_300 (first 300 elements): rps=33770.8 (overall: 31453.8) avg_msec=8.274 (overall: 9.491) LRANGE_300 (first 300 elements): rps=31343.9 (overall: 31439.7) avg_msec=9.793 (overall: 9.529) LRANGE_300 (first 300 elements): rps=29373.0 (overall: 31205.8) avg_msec=10.587 (overall: 9.642) LRANGE_300 (first 300 elements): rps=28152.0 (overall: 30897.4) avg_msec=11.254 (overall: 9.790) LRANGE_300 (first 300 elements): rps=34563.5 (overall: 31236.1) avg_msec=8.198 (overall: 9.628) LRANGE_300 (first 300 elements): rps=41832.7 (overall: 32128.9) avg_msec=6.129 (overall: 9.244) ====== LRANGE_300 (first 300 elements) ====== 138s 100000 requests completed in 3.08 seconds 138s 50 parallel clients 138s 3 bytes payload 138s keep alive: 1 138s host configuration "save": 3600 1 300 100 60 10000 138s host configuration "appendonly": no 138s multi-thread: no 138s 138s Latency by percentile distribution: 138s 0.000% <= 0.303 milliseconds (cumulative count 10) 138s 50.000% <= 8.663 milliseconds (cumulative count 50060) 138s 75.000% <= 12.167 milliseconds (cumulative count 75020) 138s 87.500% <= 14.319 milliseconds (cumulative count 87510) 138s 93.750% <= 15.775 milliseconds (cumulative count 93760) 138s 96.875% <= 17.247 milliseconds (cumulative count 96880) 138s 98.438% <= 18.175 milliseconds (cumulative count 98450) 138s 99.219% <= 18.671 milliseconds 
(cumulative count 99220) 138s 99.609% <= 19.071 milliseconds (cumulative count 99610) 138s 99.805% <= 19.983 milliseconds (cumulative count 99810) 138s 99.902% <= 20.815 milliseconds (cumulative count 99910) 138s 99.951% <= 21.087 milliseconds (cumulative count 99960) 138s 99.976% <= 21.199 milliseconds (cumulative count 99980) 138s 99.988% <= 21.359 milliseconds (cumulative count 99990) 138s 99.994% <= 21.407 milliseconds (cumulative count 100000) 138s 100.000% <= 21.407 milliseconds (cumulative count 100000) 138s 138s Cumulative distribution of latencies: 138s 0.000% <= 0.103 milliseconds (cumulative count 0) 138s 0.010% <= 0.303 milliseconds (cumulative count 10) 138s 0.020% <= 0.503 milliseconds (cumulative count 20) 138s 0.110% <= 0.607 milliseconds (cumulative count 110) 138s 0.210% <= 0.703 milliseconds (cumulative count 210) 138s 0.490% <= 0.807 milliseconds (cumulative count 490) 138s 0.780% <= 0.903 milliseconds (cumulative count 780) 138s 1.210% <= 1.007 milliseconds (cumulative count 1210) 138s 1.490% <= 1.103 milliseconds (cumulative count 1490) 138s 2.020% <= 1.207 milliseconds (cumulative count 2020) 138s 2.350% <= 1.303 milliseconds (cumulative count 2350) 138s 2.750% <= 1.407 milliseconds (cumulative count 2750) 138s 3.090% <= 1.503 milliseconds (cumulative count 3090) 138s 3.440% <= 1.607 milliseconds (cumulative count 3440) 138s 3.720% <= 1.703 milliseconds (cumulative count 3720) 138s 3.940% <= 1.807 milliseconds (cumulative count 3940) 138s 4.160% <= 1.903 milliseconds (cumulative count 4160) 138s 4.420% <= 2.007 milliseconds (cumulative count 4420) 138s 4.540% <= 2.103 milliseconds (cumulative count 4540) 138s 5.910% <= 3.103 milliseconds (cumulative count 5910) 138s 7.980% <= 4.103 milliseconds (cumulative count 7980) 138s 15.050% <= 5.103 milliseconds (cumulative count 15050) 138s 25.130% <= 6.103 milliseconds (cumulative count 25130) 138s 37.060% <= 7.103 milliseconds (cumulative count 37060) 138s 45.900% <= 8.103 milliseconds (cumulative count 45900) 138s 53.600% <= 9.103 milliseconds (cumulative count 53600) 138s 60.850% <= 10.103 milliseconds (cumulative count 60850) 138s 67.890% <= 11.103 milliseconds (cumulative count 67890) 138s 74.660% <= 12.103 milliseconds (cumulative count 74660) 138s 80.910% <= 13.103 milliseconds (cumulative count 80910) 138s 86.480% <= 14.103 milliseconds (cumulative count 86480) 138s 91.030% <= 15.103 milliseconds (cumulative count 91030) 138s 94.750% <= 16.103 milliseconds (cumulative count 94750) 138s 96.600% <= 17.103 milliseconds (cumulative count 96600) 138s 98.330% <= 18.111 milliseconds (cumulative count 98330) 138s 99.620% <= 19.103 milliseconds (cumulative count 99620) 138s 99.830% <= 20.111 milliseconds (cumulative count 99830) 138s 99.960% <= 21.103 milliseconds (cumulative count 99960) 138s 100.000% <= 22.111 milliseconds (cumulative count 100000) 138s 138s Summary: 138s throughput summary: 32425.42 requests per second 138s latency summary (msec): 138s avg min p50 p95 p99 max 138s 9.129 0.296 8.663 16.199 18.495 21.407 143s LRANGE_500 (first 500 elements): rps=8228.3 (overall: 14121.6) avg_msec=19.594 (overall: 19.594) LRANGE_500 (first 500 elements): rps=17968.1 (overall: 16541.4) avg_msec=15.122 (overall: 16.538) LRANGE_500 (first 500 elements): rps=21384.3 (overall: 18429.7) avg_msec=12.765 (overall: 14.831) LRANGE_500 (first 500 elements): rps=21856.6 (overall: 19380.1) avg_msec=11.472 (overall: 13.780) LRANGE_500 (first 500 elements): rps=22482.1 (overall: 20053.6) avg_msec=11.059 (overall: 13.118) LRANGE_500 (first 
500 elements): rps=23151.4 (overall: 20606.3) avg_msec=8.739 (overall: 12.240) LRANGE_500 (first 500 elements): rps=20426.3 (overall: 20579.0) avg_msec=11.144 (overall: 12.075) LRANGE_500 (first 500 elements): rps=14661.4 (overall: 19800.9) avg_msec=18.813 (overall: 12.731) LRANGE_500 (first 500 elements): rps=14661.4 (overall: 19203.7) avg_msec=18.794 (overall: 13.269) LRANGE_500 (first 500 elements): rps=14621.5 (overall: 18726.7) avg_msec=18.963 (overall: 13.732) LRANGE_500 (first 500 elements): rps=14980.1 (overall: 18373.4) avg_msec=18.671 (overall: 14.112) LRANGE_500 (first 500 elements): rps=14680.0 (overall: 18056.3) avg_msec=19.274 (overall: 14.472) LRANGE_500 (first 500 elements): rps=20541.5 (overall: 18255.0) avg_msec=13.110 (overall: 14.350) LRANGE_500 (first 500 elements): rps=15732.0 (overall: 18070.3) avg_msec=17.304 (overall: 14.538) LRANGE_500 (first 500 elements): rps=14780.9 (overall: 17845.1) avg_msec=19.015 (overall: 14.792) LRANGE_500 (first 500 elements): rps=14701.2 (overall: 17643.6) avg_msec=18.551 (overall: 14.992) LRANGE_500 (first 500 elements): rps=15179.3 (overall: 17495.2) avg_msec=18.680 (overall: 15.185) LRANGE_500 (first 500 elements): rps=22502.0 (overall: 17781.7) avg_msec=11.071 (overall: 14.887) LRANGE_500 (first 500 elements): rps=21822.1 (overall: 18000.4) avg_msec=10.878 (overall: 14.624) LRANGE_500 (first 500 elements): rps=23365.1 (overall: 18274.9) avg_msec=10.322 (overall: 14.343) LRANGE_500 (first 500 elements): rps=21185.8 (overall: 18417.1) avg_msec=11.738 (overall: 14.196) ====== LRANGE_500 (first 500 elements) ====== 143s 100000 requests completed in 5.38 seconds 143s 50 parallel clients 143s 3 bytes payload 143s keep alive: 1 143s host configuration "save": 3600 1 300 100 60 10000 143s host configuration "appendonly": no 143s multi-thread: no 143s 143s Latency by percentile distribution: 143s 0.000% <= 0.367 milliseconds (cumulative count 10) 143s 50.000% <= 13.167 milliseconds (cumulative count 50010) 143s 75.000% <= 18.511 milliseconds (cumulative count 75010) 143s 87.500% <= 20.831 milliseconds (cumulative count 87570) 143s 93.750% <= 22.271 milliseconds (cumulative count 93780) 143s 96.875% <= 25.375 milliseconds (cumulative count 96880) 143s 98.438% <= 27.807 milliseconds (cumulative count 98450) 143s 99.219% <= 28.575 milliseconds (cumulative count 99240) 143s 99.609% <= 29.023 milliseconds (cumulative count 99610) 143s 99.805% <= 29.311 milliseconds (cumulative count 99820) 143s 99.902% <= 29.535 milliseconds (cumulative count 99920) 143s 99.951% <= 29.679 milliseconds (cumulative count 99970) 143s 99.976% <= 29.743 milliseconds (cumulative count 99980) 143s 99.988% <= 29.823 milliseconds (cumulative count 99990) 143s 99.994% <= 29.967 milliseconds (cumulative count 100000) 143s 100.000% <= 29.967 milliseconds (cumulative count 100000) 143s 143s Cumulative distribution of latencies: 143s 0.000% <= 0.103 milliseconds (cumulative count 0) 143s 0.010% <= 0.407 milliseconds (cumulative count 10) 143s 0.260% <= 0.703 milliseconds (cumulative count 260) 143s 0.350% <= 0.807 milliseconds (cumulative count 350) 143s 0.420% <= 0.903 milliseconds (cumulative count 420) 143s 0.910% <= 1.007 milliseconds (cumulative count 910) 143s 1.150% <= 1.103 milliseconds (cumulative count 1150) 143s 1.330% <= 1.207 milliseconds (cumulative count 1330) 143s 1.510% <= 1.303 milliseconds (cumulative count 1510) 143s 1.600% <= 1.407 milliseconds (cumulative count 1600) 143s 1.740% <= 1.503 milliseconds (cumulative count 1740) 143s 1.810% <= 1.607 
milliseconds (cumulative count 1810) 143s 1.860% <= 1.703 milliseconds (cumulative count 1860) 143s 1.890% <= 1.807 milliseconds (cumulative count 1890) 143s 1.910% <= 1.903 milliseconds (cumulative count 1910) 143s 1.960% <= 2.007 milliseconds (cumulative count 1960) 143s 1.990% <= 2.103 milliseconds (cumulative count 1990) 143s 2.490% <= 3.103 milliseconds (cumulative count 2490) 143s 3.160% <= 4.103 milliseconds (cumulative count 3160) 143s 3.850% <= 5.103 milliseconds (cumulative count 3850) 143s 6.270% <= 6.103 milliseconds (cumulative count 6270) 143s 9.440% <= 7.103 milliseconds (cumulative count 9440) 143s 14.390% <= 8.103 milliseconds (cumulative count 14390) 143s 22.160% <= 9.103 milliseconds (cumulative count 22160) 143s 32.600% <= 10.103 milliseconds (cumulative count 32600) 143s 40.740% <= 11.103 milliseconds (cumulative count 40740) 143s 46.190% <= 12.103 milliseconds (cumulative count 46190) 143s 49.810% <= 13.103 milliseconds (cumulative count 49810) 143s 52.760% <= 14.103 milliseconds (cumulative count 52760) 143s 55.140% <= 15.103 milliseconds (cumulative count 55140) 143s 59.370% <= 16.103 milliseconds (cumulative count 59370) 143s 65.680% <= 17.103 milliseconds (cumulative count 65680) 143s 72.400% <= 18.111 milliseconds (cumulative count 72400) 143s 78.620% <= 19.103 milliseconds (cumulative count 78620) 143s 84.130% <= 20.111 milliseconds (cumulative count 84130) 143s 88.850% <= 21.103 milliseconds (cumulative count 88850) 143s 93.230% <= 22.111 milliseconds (cumulative count 93230) 143s 95.420% <= 23.103 milliseconds (cumulative count 95420) 143s 96.030% <= 24.111 milliseconds (cumulative count 96030) 143s 96.690% <= 25.103 milliseconds (cumulative count 96690) 143s 97.270% <= 26.111 milliseconds (cumulative count 97270) 143s 97.780% <= 27.103 milliseconds (cumulative count 97780) 143s 98.850% <= 28.111 milliseconds (cumulative count 98850) 143s 99.640% <= 29.103 milliseconds (cumulative count 99640) 143s 100.000% <= 30.111 milliseconds (cumulative count 100000) 143s 143s Summary: 143s throughput summary: 18597.73 requests per second 143s latency summary (msec): 143s avg min p50 p95 p99 max 143s 13.947 0.360 13.167 22.735 28.351 29.967 151s LRANGE_600 (first 600 elements): rps=2212.0 (overall: 10843.1) avg_msec=23.779 (overall: 23.779) LRANGE_600 (first 600 elements): rps=11716.5 (overall: 11570.5) avg_msec=22.766 (overall: 22.925) LRANGE_600 (first 600 elements): rps=11087.0 (overall: 11351.3) avg_msec=24.628 (overall: 23.679) LRANGE_600 (first 600 elements): rps=12348.7 (overall: 11669.1) avg_msec=19.930 (overall: 22.415) LRANGE_600 (first 600 elements): rps=11764.9 (overall: 11691.6) avg_msec=21.369 (overall: 22.168) LRANGE_600 (first 600 elements): rps=9753.9 (overall: 11317.5) avg_msec=26.307 (overall: 22.857) LRANGE_600 (first 600 elements): rps=11297.6 (overall: 11314.3) avg_msec=23.565 (overall: 22.970) LRANGE_600 (first 600 elements): rps=14221.3 (overall: 11716.0) avg_msec=16.831 (overall: 21.940) LRANGE_600 (first 600 elements): rps=12232.0 (overall: 11778.0) avg_msec=23.577 (overall: 22.144) LRANGE_600 (first 600 elements): rps=11654.8 (overall: 11764.7) avg_msec=22.762 (overall: 22.210) LRANGE_600 (first 600 elements): rps=12402.4 (overall: 11826.6) avg_msec=23.069 (overall: 22.298) LRANGE_600 (first 600 elements): rps=11945.1 (overall: 11837.3) avg_msec=22.860 (overall: 22.349) LRANGE_600 (first 600 elements): rps=11848.6 (overall: 11838.2) avg_msec=22.618 (overall: 22.371) LRANGE_600 (first 600 elements): rps=12484.0 (overall: 11886.5) avg_msec=22.732 
(overall: 22.399) LRANGE_600 (first 600 elements): rps=11031.9 (overall: 11826.8) avg_msec=23.199 (overall: 22.451) LRANGE_600 (first 600 elements): rps=12828.7 (overall: 11892.2) avg_msec=22.393 (overall: 22.447) LRANGE_600 (first 600 elements): rps=12249.0 (overall: 11914.3) avg_msec=23.058 (overall: 22.486) LRANGE_600 (first 600 elements): rps=11325.4 (overall: 11880.1) avg_msec=25.774 (overall: 22.668) LRANGE_600 (first 600 elements): rps=11854.3 (overall: 11878.7) avg_msec=22.263 (overall: 22.645) LRANGE_600 (first 600 elements): rps=11617.5 (overall: 11865.2) avg_msec=23.209 (overall: 22.674) LRANGE_600 (first 600 elements): rps=12669.3 (overall: 11904.8) avg_msec=23.173 (overall: 22.700) LRANGE_600 (first 600 elements): rps=11852.1 (overall: 11902.2) avg_msec=22.410 (overall: 22.686) LRANGE_600 (first 600 elements): rps=11613.5 (overall: 11889.3) avg_msec=22.436 (overall: 22.675) LRANGE_600 (first 600 elements): rps=11821.4 (overall: 11886.4) avg_msec=22.673 (overall: 22.675) LRANGE_600 (first 600 elements): rps=12470.1 (overall: 11910.4) avg_msec=22.582 (overall: 22.671) LRANGE_600 (first 600 elements): rps=12319.1 (overall: 11926.9) avg_msec=22.363 (overall: 22.658) LRANGE_600 (first 600 elements): rps=11488.0 (overall: 11910.3) avg_msec=22.612 (overall: 22.657) LRANGE_600 (first 600 elements): rps=11880.5 (overall: 11909.2) avg_msec=22.866 (overall: 22.664) LRANGE_600 (first 600 elements): rps=12645.4 (overall: 11935.1) avg_msec=22.662 (overall: 22.664) LRANGE_600 (first 600 elements): rps=11071.7 (overall: 11905.8) avg_msec=23.566 (overall: 22.693) LRANGE_600 (first 600 elements): rps=11980.1 (overall: 11908.2) avg_msec=22.845 (overall: 22.698) LRANGE_600 (first 600 elements): rps=14828.7 (overall: 12001.3) avg_msec=18.567 (overall: 22.535) LRANGE_600 (first 600 elements): rps=13896.4 (overall: 12059.8) avg_msec=19.154 (overall: 22.415) ====== LRANGE_600 (first 600 elements) ====== 151s 100000 requests completed in 8.28 seconds 151s 50 parallel clients 151s 3 bytes payload 151s keep alive: 1 151s host configuration "save": 3600 1 300 100 60 10000 151s host configuration "appendonly": no 151s multi-thread: no 151s 151s Latency by percentile distribution: 151s 0.000% <= 0.431 milliseconds (cumulative count 10) 151s 50.000% <= 22.991 milliseconds (cumulative count 50020) 151s 75.000% <= 26.479 milliseconds (cumulative count 75030) 151s 87.500% <= 29.839 milliseconds (cumulative count 87510) 151s 93.750% <= 31.823 milliseconds (cumulative count 93760) 151s 96.875% <= 32.831 milliseconds (cumulative count 96920) 151s 98.438% <= 34.463 milliseconds (cumulative count 98450) 151s 99.219% <= 36.863 milliseconds (cumulative count 99220) 151s 99.609% <= 38.591 milliseconds (cumulative count 99610) 151s 99.805% <= 40.159 milliseconds (cumulative count 99810) 151s 99.902% <= 41.951 milliseconds (cumulative count 99910) 151s 99.951% <= 43.647 milliseconds (cumulative count 99960) 151s 99.976% <= 44.063 milliseconds (cumulative count 99980) 151s 99.988% <= 44.255 milliseconds (cumulative count 99990) 151s 99.994% <= 44.447 milliseconds (cumulative count 100000) 151s 100.000% <= 44.447 milliseconds (cumulative count 100000) 151s 151s Cumulative distribution of latencies: 151s 0.000% <= 0.103 milliseconds (cumulative count 0) 151s 0.010% <= 0.503 milliseconds (cumulative count 10) 151s 0.030% <= 0.703 milliseconds (cumulative count 30) 151s 0.090% <= 0.807 milliseconds (cumulative count 90) 151s 0.740% <= 0.903 milliseconds (cumulative count 740) 151s 0.940% <= 1.007 milliseconds (cumulative 
count 940) 151s 1.250% <= 1.103 milliseconds (cumulative count 1250) 151s 1.950% <= 1.207 milliseconds (cumulative count 1950) 151s 2.260% <= 1.303 milliseconds (cumulative count 2260) 151s 2.550% <= 1.407 milliseconds (cumulative count 2550) 151s 2.820% <= 1.503 milliseconds (cumulative count 2820) 151s 3.150% <= 1.607 milliseconds (cumulative count 3150) 151s 3.400% <= 1.703 milliseconds (cumulative count 3400) 151s 3.640% <= 1.807 milliseconds (cumulative count 3640) 151s 3.840% <= 1.903 milliseconds (cumulative count 3840) 151s 4.060% <= 2.007 milliseconds (cumulative count 4060) 151s 4.190% <= 2.103 milliseconds (cumulative count 4190) 151s 5.340% <= 3.103 milliseconds (cumulative count 5340) 151s 5.800% <= 4.103 milliseconds (cumulative count 5800) 151s 6.040% <= 5.103 milliseconds (cumulative count 6040) 151s 6.230% <= 6.103 milliseconds (cumulative count 6230) 151s 6.470% <= 7.103 milliseconds (cumulative count 6470) 151s 6.700% <= 8.103 milliseconds (cumulative count 6700) 151s 6.850% <= 9.103 milliseconds (cumulative count 6850) 151s 7.220% <= 10.103 milliseconds (cumulative count 7220) 151s 7.910% <= 11.103 milliseconds (cumulative count 7910) 151s 8.750% <= 12.103 milliseconds (cumulative count 8750) 151s 9.420% <= 13.103 milliseconds (cumulative count 9420) 151s 10.430% <= 14.103 milliseconds (cumulative count 10430) 151s 11.470% <= 15.103 milliseconds (cumulative count 11470) 151s 12.520% <= 16.103 milliseconds (cumulative count 12520) 151s 13.620% <= 17.103 milliseconds (cumulative count 13620) 151s 14.750% <= 18.111 milliseconds (cumulative count 14750) 151s 17.380% <= 19.103 milliseconds (cumulative count 17380) 151s 23.720% <= 20.111 milliseconds (cumulative count 23720) 151s 32.870% <= 21.103 milliseconds (cumulative count 32870) 151s 42.310% <= 22.111 milliseconds (cumulative count 42310) 151s 50.930% <= 23.103 milliseconds (cumulative count 50930) 151s 59.060% <= 24.111 milliseconds (cumulative count 59060) 151s 66.970% <= 25.103 milliseconds (cumulative count 66970) 151s 73.470% <= 26.111 milliseconds (cumulative count 73470) 151s 77.490% <= 27.103 milliseconds (cumulative count 77490) 151s 81.510% <= 28.111 milliseconds (cumulative count 81510) 151s 85.080% <= 29.103 milliseconds (cumulative count 85080) 151s 88.240% <= 30.111 milliseconds (cumulative count 88240) 151s 91.110% <= 31.103 milliseconds (cumulative count 91110) 151s 94.830% <= 32.111 milliseconds (cumulative count 94830) 151s 97.350% <= 33.119 milliseconds (cumulative count 97350) 151s 98.190% <= 34.111 milliseconds (cumulative count 98190) 151s 98.820% <= 35.103 milliseconds (cumulative count 98820) 151s 99.090% <= 36.127 milliseconds (cumulative count 99090) 151s 99.300% <= 37.119 milliseconds (cumulative count 99300) 151s 99.520% <= 38.111 milliseconds (cumulative count 99520) 151s 99.700% <= 39.103 milliseconds (cumulative count 99700) 151s 99.800% <= 40.127 milliseconds (cumulative count 99800) 151s 99.860% <= 41.119 milliseconds (cumulative count 99860) 151s 99.910% <= 42.111 milliseconds (cumulative count 99910) 151s 99.930% <= 43.103 milliseconds (cumulative count 99930) 151s 99.980% <= 44.127 milliseconds (cumulative count 99980) 151s 100.000% <= 45.119 milliseconds (cumulative count 100000) 151s 151s Summary: 151s throughput summary: 12074.38 requests per second 151s latency summary (msec): 151s avg min p50 p95 p99 max 151s 22.392 0.424 22.991 32.175 35.615 44.447 152s MSET (10 keys): rps=140757.0 (overall: 375851.1) avg_msec=1.251 (overall: 1.251) ====== MSET (10 keys) ====== 152s 100000 
requests completed in 0.27 seconds 152s 50 parallel clients 152s 3 bytes payload 152s keep alive: 1 152s host configuration "save": 3600 1 300 100 60 10000 152s host configuration "appendonly": no 152s multi-thread: no 152s 152s Latency by percentile distribution: 152s 0.000% <= 0.231 milliseconds (cumulative count 10) 152s 50.000% <= 1.279 milliseconds (cumulative count 52980) 152s 75.000% <= 1.343 milliseconds (cumulative count 75780) 152s 87.500% <= 1.391 milliseconds (cumulative count 87890) 152s 93.750% <= 1.439 milliseconds (cumulative count 94090) 152s 96.875% <= 1.487 milliseconds (cumulative count 97170) 152s 98.438% <= 1.527 milliseconds (cumulative count 98440) 152s 99.219% <= 1.615 milliseconds (cumulative count 99250) 152s 99.609% <= 1.695 milliseconds (cumulative count 99610) 152s 99.805% <= 1.775 milliseconds (cumulative count 99820) 152s 99.902% <= 1.823 milliseconds (cumulative count 99910) 152s 99.951% <= 1.871 milliseconds (cumulative count 99960) 152s 99.976% <= 1.887 milliseconds (cumulative count 99980) 152s 99.988% <= 1.903 milliseconds (cumulative count 99990) 152s 99.994% <= 1.927 milliseconds (cumulative count 100000) 152s 100.000% <= 1.927 milliseconds (cumulative count 100000) 152s 152s Cumulative distribution of latencies: 152s 0.000% <= 0.103 milliseconds (cumulative count 0) 152s 0.010% <= 0.303 milliseconds (cumulative count 10) 152s 0.120% <= 0.407 milliseconds (cumulative count 120) 152s 0.250% <= 0.607 milliseconds (cumulative count 250) 152s 0.660% <= 0.703 milliseconds (cumulative count 660) 152s 1.940% <= 0.807 milliseconds (cumulative count 1940) 152s 2.930% <= 0.903 milliseconds (cumulative count 2930) 152s 3.210% <= 1.007 milliseconds (cumulative count 3210) 152s 3.790% <= 1.103 milliseconds (cumulative count 3790) 152s 24.210% <= 1.207 milliseconds (cumulative count 24210) 152s 62.350% <= 1.303 milliseconds (cumulative count 62350) 152s 90.480% <= 1.407 milliseconds (cumulative count 90480) 152s 97.770% <= 1.503 milliseconds (cumulative count 97770) 152s 99.200% <= 1.607 milliseconds (cumulative count 99200) 152s 99.630% <= 1.703 milliseconds (cumulative count 99630) 152s 99.880% <= 1.807 milliseconds (cumulative count 99880) 152s 99.990% <= 1.903 milliseconds (cumulative count 99990) 152s 100.000% <= 2.007 milliseconds (cumulative count 100000) 152s 152s Summary: 152s throughput summary: 374531.84 requests per second 152s latency summary (msec): 152s avg min p50 p95 p99 max 152s 1.270 0.224 1.279 1.455 1.583 1.927 152s XADD: rps=224160.0 (overall: 737368.4) avg_msec=0.608 (overall: 0.608) ====== XADD ====== 152s 100000 requests completed in 0.13 seconds 152s 50 parallel clients 152s 3 bytes payload 152s keep alive: 1 152s host configuration "save": 3600 1 300 100 60 10000 152s host configuration "appendonly": no 152s multi-thread: no 152s 152s Latency by percentile distribution: 152s 0.000% <= 0.135 milliseconds (cumulative count 10) 152s 50.000% <= 0.607 milliseconds (cumulative count 51520) 152s 75.000% <= 0.671 milliseconds (cumulative count 77370) 152s 87.500% <= 0.711 milliseconds (cumulative count 88620) 152s 93.750% <= 0.743 milliseconds (cumulative count 94000) 152s 96.875% <= 0.775 milliseconds (cumulative count 96940) 152s 98.438% <= 0.815 milliseconds (cumulative count 98480) 152s 99.219% <= 0.871 milliseconds (cumulative count 99230) 152s 99.609% <= 0.935 milliseconds (cumulative count 99630) 152s 99.805% <= 1.031 milliseconds (cumulative count 99820) 152s 99.902% <= 1.071 milliseconds (cumulative count 99920) 152s 99.951% <= 1.111 
milliseconds (cumulative count 99960)
152s 99.976% <= 1.127 milliseconds (cumulative count 99980)
152s 99.988% <= 1.135 milliseconds (cumulative count 99990)
152s 99.994% <= 1.159 milliseconds (cumulative count 100000)
152s 100.000% <= 1.159 milliseconds (cumulative count 100000)
152s
152s Cumulative distribution of latencies:
152s 0.000% <= 0.103 milliseconds (cumulative count 0)
152s 0.170% <= 0.207 milliseconds (cumulative count 170)
152s 0.560% <= 0.303 milliseconds (cumulative count 560)
152s 3.140% <= 0.407 milliseconds (cumulative count 3140)
152s 12.090% <= 0.503 milliseconds (cumulative count 12090)
152s 51.520% <= 0.607 milliseconds (cumulative count 51520)
152s 86.670% <= 0.703 milliseconds (cumulative count 86670)
152s 98.300% <= 0.807 milliseconds (cumulative count 98300)
152s 99.480% <= 0.903 milliseconds (cumulative count 99480)
152s 99.770% <= 1.007 milliseconds (cumulative count 99770)
152s 99.950% <= 1.103 milliseconds (cumulative count 99950)
152s 100.000% <= 1.207 milliseconds (cumulative count 100000)
152s
152s Summary:
152s throughput summary: 751879.69 requests per second
152s latency summary (msec):
152s avg min p50 p95 p99 max
152s 0.605 0.128 0.607 0.759 0.847 1.159
152s
152s autopkgtest [17:20:19]: test 0002-benchmark: -----------------------]
153s autopkgtest [17:20:20]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - -
153s 0002-benchmark PASS
153s autopkgtest [17:20:20]: test 0003-redict-check-aof: preparing testbed
153s Reading package lists...
153s Building dependency tree...
153s Reading state information...
153s Solving dependencies...
153s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
155s autopkgtest [17:20:22]: test 0003-redict-check-aof: [-----------------------
155s autopkgtest [17:20:22]: test 0003-redict-check-aof: -----------------------]
156s 0003-redict-check-aof PASS
156s autopkgtest [17:20:23]: test 0003-redict-check-aof: - - - - - - - - - - results - - - - - - - - - -
156s autopkgtest [17:20:23]: test 0004-redict-check-rdb: preparing testbed
156s Reading package lists...
156s Building dependency tree...
156s Reading state information...
157s Solving dependencies...
157s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
158s autopkgtest [17:20:25]: test 0004-redict-check-rdb: [-----------------------
163s OK
163s [offset 0] Checking RDB file /var/lib/redict/dump.rdb
163s [offset 26] AUX FIELD redis-ver = '7.3.5'
163s [offset 40] AUX FIELD redis-bits = '64'
163s [offset 52] AUX FIELD ctime = '1762276830'
163s [offset 67] AUX FIELD used-mem = '3202128'
163s [offset 79] AUX FIELD aof-base = '0'
163s [offset 81] Selecting DB ID 0
163s [offset 564988] Checksum OK
163s [offset 564988] \o/ RDB looks OK! \o/
163s [info] 5 keys read
163s [info] 0 expires
163s [info] 0 already expired
164s autopkgtest [17:20:31]: test 0004-redict-check-rdb: -----------------------]
164s 0004-redict-check-rdb PASS
164s autopkgtest [17:20:31]: test 0004-redict-check-rdb: - - - - - - - - - - results - - - - - - - - - -
164s autopkgtest [17:20:31]: test 0005-cjson: preparing testbed
165s Reading package lists...
165s Building dependency tree...
165s Reading state information...
165s Solving dependencies...
165s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
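The 0003-redict-check-aof and 0004-redict-check-rdb tests above exercise the packaged integrity checkers. The test scripts themselves are not part of this log, so the commands below are only a sketch of how the same checks can be run by hand: the RDB path is the one reported in the [offset 0] line above, while the AOF path is a placeholder, since this run has appendonly disabled.

  $ redict-check-rdb /var/lib/redict/dump.rdb    # walks the AUX fields and verifies the trailing checksum, as logged above
  $ redict-check-aof <appendonly-file>           # placeholder path; only meaningful once appendonly has been enabled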
166s autopkgtest [17:20:33]: test 0005-cjson: [-----------------------
172s
172s autopkgtest [17:20:39]: test 0005-cjson: -----------------------]
172s autopkgtest [17:20:39]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - -
172s 0005-cjson PASS
173s autopkgtest [17:20:40]: @@@@@@@@@@@@@@@@@@@@ summary
173s 0001-redict-cli PASS
173s 0002-benchmark PASS
173s 0003-redict-check-aof PASS
173s 0004-redict-check-rdb PASS
173s 0005-cjson PASS
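For reference, the configuration echoed in every 0002-benchmark block above (100000 requests, 50 parallel clients, 3 bytes payload, keep alive 1) matches the benchmark tool's defaults. The invocation below is an assumption rather than the literal test script: it presumes redict-benchmark accepts the redis-benchmark-compatible flags, with -n for the request count, -c for parallel clients, -d for payload size and -t to select commands.

  $ redict-benchmark -n 100000 -c 50 -d 3 -t lpush,lrange,mset,xadd
  # Prints the same per-command throughput and latency-percentile tables
  # seen above; -q condenses the output to one summary line per command.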