0s autopkgtest [16:10:44]: starting date and time: 2025-03-15 16:10:44+0000
0s autopkgtest [16:10:44]: git checkout: 325255d2 Merge branch 'pin-any-arch' into 'ubuntu/production'
0s autopkgtest [16:10:44]: host juju-7f2275-prod-proposed-migration-environment-2; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.b2k53e1d/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:glibc --apt-upgrade redis --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=glibc/2.41-1ubuntu2 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-2@bos03-arm64-17.secgroup --name adt-plucky-arm64-redis-20250315-161044-juju-7f2275-prod-proposed-migration-environment-2-f907425e-6e99-464e-8d26-aa1612b7b34e --image adt/ubuntu-plucky-arm64-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-2 --net-id=net_prod-proposed-migration -e TERM=linux -e ''"'"'http_proxy=http://squid.internal:3128'"'"'' -e ''"'"'https_proxy=http://squid.internal:3128'"'"'' -e ''"'"'no_proxy=127.0.0.1,127.0.1.1,login.ubuntu.com,localhost,localdomain,novalocal,internal,archive.ubuntu.com,ports.ubuntu.com,security.ubuntu.com,ddebs.ubuntu.com,changelogs.ubuntu.com,keyserver.ubuntu.com,launchpadlibrarian.net,launchpadcontent.net,launchpad.net,10.24.0.0/24,keystone.ps5.canonical.com,objectstorage.prodstack5.canonical.com,radosgw.ps5.canonical.com'"'"'' --mirror=http://ftpmaster.internal/ubuntu/
177s autopkgtest [16:13:41]: testbed dpkg architecture: arm64
177s autopkgtest [16:13:41]: testbed apt version: 2.9.33
177s autopkgtest [16:13:41]: @@@@@@@@@@@@@@@@@@@@ test bed setup
178s autopkgtest [16:13:42]: testbed release detected to be: None
179s autopkgtest [16:13:43]: updating testbed package index (apt update)
179s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed InRelease [126 kB]
180s Hit:2 http://ftpmaster.internal/ubuntu plucky InRelease
180s Hit:3 http://ftpmaster.internal/ubuntu plucky-updates InRelease
180s Hit:4 http://ftpmaster.internal/ubuntu plucky-security InRelease
180s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/universe Sources [379 kB]
181s Get:6 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse Sources [15.8 kB]
181s Get:7 http://ftpmaster.internal/ubuntu plucky-proposed/main Sources [99.7 kB]
181s Get:8 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 Packages [111 kB]
181s Get:9 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 c-n-f Metadata [1856 B]
181s Get:10 http://ftpmaster.internal/ubuntu plucky-proposed/restricted arm64 c-n-f Metadata [116 B]
181s Get:11 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 Packages [324 kB]
182s Get:12 http://ftpmaster.internal/ubuntu plucky-proposed/universe arm64 c-n-f Metadata [14.7 kB]
182s Get:13 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 Packages [4948 B]
182s Get:14 http://ftpmaster.internal/ubuntu plucky-proposed/multiverse arm64 c-n-f Metadata [268 B]
182s Fetched 1078 kB in 2s (449 kB/s)
183s Reading package lists...
184s Reading package lists...
184s Building dependency tree...
184s Reading state information...
185s Calculating upgrade...
185s Calculating upgrade...
186s The following packages will be upgraded:
186s   pinentry-curses python3-jinja2 strace
186s 3 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
186s Need to get 647 kB of archives.
186s After this operation, 11.3 kB of additional disk space will be used.
186s Get:1 http://ftpmaster.internal/ubuntu plucky/main arm64 strace arm64 6.13+ds-1ubuntu1 [499 kB]
187s Get:2 http://ftpmaster.internal/ubuntu plucky/main arm64 pinentry-curses arm64 1.3.1-2ubuntu3 [39.2 kB]
187s Get:3 http://ftpmaster.internal/ubuntu plucky/main arm64 python3-jinja2 all 3.1.5-2ubuntu1 [109 kB]
187s Fetched 647 kB in 1s (609 kB/s)
188s (Reading database ... 117701 files and directories currently installed.)
188s Preparing to unpack .../strace_6.13+ds-1ubuntu1_arm64.deb ...
188s Unpacking strace (6.13+ds-1ubuntu1) over (6.11-0ubuntu1) ...
188s Preparing to unpack .../pinentry-curses_1.3.1-2ubuntu3_arm64.deb ...
188s Unpacking pinentry-curses (1.3.1-2ubuntu3) over (1.3.1-2ubuntu2) ...
188s Preparing to unpack .../python3-jinja2_3.1.5-2ubuntu1_all.deb ...
188s Unpacking python3-jinja2 (3.1.5-2ubuntu1) over (3.1.5-2) ...
188s Setting up pinentry-curses (1.3.1-2ubuntu3) ...
188s Setting up python3-jinja2 (3.1.5-2ubuntu1) ...
188s Setting up strace (6.13+ds-1ubuntu1) ...
188s Processing triggers for man-db (2.13.0-1) ...
189s Reading package lists...
189s Building dependency tree...
189s Reading state information...
190s Solving dependencies...
190s The following packages will be REMOVED:
190s   libnsl2* libpython3.12-minimal* libpython3.12-stdlib* libpython3.12t64*
190s   libunwind8* linux-headers-6.11.0-8* linux-headers-6.11.0-8-generic*
190s   linux-image-6.11.0-8-generic* linux-modules-6.11.0-8-generic*
190s   linux-tools-6.11.0-8* linux-tools-6.11.0-8-generic*
191s 0 upgraded, 0 newly installed, 11 to remove and 5 not upgraded.
191s After this operation, 267 MB disk space will be freed.
191s (Reading database ... 117701 files and directories currently installed.)
191s Removing linux-tools-6.11.0-8-generic (6.11.0-8.8) ...
191s Removing linux-tools-6.11.0-8 (6.11.0-8.8) ...
191s Removing libpython3.12t64:arm64 (3.12.9-1) ...
191s Removing libpython3.12-stdlib:arm64 (3.12.9-1) ...
191s Removing libnsl2:arm64 (1.3.0-3build3) ...
191s Removing libpython3.12-minimal:arm64 (3.12.9-1) ...
191s Removing libunwind8:arm64 (1.6.2-3.1) ...
191s Removing linux-headers-6.11.0-8-generic (6.11.0-8.8) ...
192s Removing linux-headers-6.11.0-8 (6.11.0-8.8) ...
193s Removing linux-image-6.11.0-8-generic (6.11.0-8.8) ...
193s I: /boot/vmlinuz.old is now a symlink to vmlinuz-6.14.0-10-generic
193s I: /boot/initrd.img.old is now a symlink to initrd.img-6.14.0-10-generic
193s /etc/kernel/postrm.d/initramfs-tools:
193s update-initramfs: Deleting /boot/initrd.img-6.11.0-8-generic
193s /etc/kernel/postrm.d/zz-flash-kernel:
193s flash-kernel: Kernel 6.11.0-8-generic has been removed.
194s flash-kernel: A higher version (6.14.0-10-generic) is still installed, no reflashing required.
194s /etc/kernel/postrm.d/zz-update-grub:
194s Sourcing file `/etc/default/grub'
194s Sourcing file `/etc/default/grub.d/50-cloudimg-settings.cfg'
194s Generating grub configuration file ...
194s Found linux image: /boot/vmlinuz-6.14.0-10-generic
194s Found initrd image: /boot/initrd.img-6.14.0-10-generic
195s Warning: os-prober will not be executed to detect other bootable partitions.
195s Systems on them will not be added to the GRUB boot configuration.
195s Check GRUB_DISABLE_OS_PROBER documentation entry.
195s Adding boot menu entry for UEFI Firmware Settings ...
195s done
195s Removing linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
195s Processing triggers for libc-bin (2.41-1ubuntu1) ...
195s (Reading database ... 81650 files and directories currently installed.)
195s Purging configuration files for linux-image-6.11.0-8-generic (6.11.0-8.8) ...
195s Purging configuration files for libpython3.12-minimal:arm64 (3.12.9-1) ...
195s Purging configuration files for linux-modules-6.11.0-8-generic (6.11.0-8.8) ...
195s autopkgtest [16:13:59]: upgrading testbed (apt dist-upgrade and autopurge)
196s Reading package lists...
196s Building dependency tree...
196s Reading state information...
197s Calculating upgrade...
197s Starting pkgProblemResolver with broken count: 0
197s Starting 2 pkgProblemResolver with broken count: 0
197s Done
198s Entering ResolveByKeep
198s
198s Calculating upgrade...
199s The following packages will be upgraded:
199s   libc-bin libc-dev-bin libc6 libc6-dev locales
199s 5 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
199s Need to get 9530 kB of archives.
199s After this operation, 0 B of additional disk space will be used.
199s Get:1 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6-dev arm64 2.41-1ubuntu2 [1750 kB]
201s Get:2 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-dev-bin arm64 2.41-1ubuntu2 [24.0 kB]
201s Get:3 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc6 arm64 2.41-1ubuntu2 [2910 kB]
204s Get:4 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 libc-bin arm64 2.41-1ubuntu2 [600 kB]
204s Get:5 http://ftpmaster.internal/ubuntu plucky-proposed/main arm64 locales all 2.41-1ubuntu2 [4246 kB]
210s Preconfiguring packages ...
210s Fetched 9530 kB in 11s (892 kB/s)
210s (Reading database ... 81647 files and directories currently installed.)
210s Preparing to unpack .../libc6-dev_2.41-1ubuntu2_arm64.deb ...
210s Unpacking libc6-dev:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
210s Preparing to unpack .../libc-dev-bin_2.41-1ubuntu2_arm64.deb ...
210s Unpacking libc-dev-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
210s Preparing to unpack .../libc6_2.41-1ubuntu2_arm64.deb ...
211s Unpacking libc6:arm64 (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
211s Setting up libc6:arm64 (2.41-1ubuntu2) ...
211s (Reading database ... 81647 files and directories currently installed.)
211s Preparing to unpack .../libc-bin_2.41-1ubuntu2_arm64.deb ...
211s Unpacking libc-bin (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
211s Setting up libc-bin (2.41-1ubuntu2) ...
211s (Reading database ... 81647 files and directories currently installed.)
211s Preparing to unpack .../locales_2.41-1ubuntu2_all.deb ...
211s Unpacking locales (2.41-1ubuntu2) over (2.41-1ubuntu1) ...
211s Setting up locales (2.41-1ubuntu2) ...
212s Generating locales (this might take a while)...
214s   en_US.UTF-8... done
214s Generation complete.
214s Setting up libc-dev-bin (2.41-1ubuntu2) ...
214s Setting up libc6-dev:arm64 (2.41-1ubuntu2) ...
214s Processing triggers for man-db (2.13.0-1) ...
215s Processing triggers for systemd (257.3-1ubuntu3) ...
216s Reading package lists...
217s Building dependency tree...
217s Reading state information...
217s Starting pkgProblemResolver with broken count: 0
217s Starting 2 pkgProblemResolver with broken count: 0
217s Done
218s Solving dependencies...
218s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
218s autopkgtest [16:14:22]: rebooting testbed after setup commands that affected boot
242s autopkgtest [16:14:46]: testbed running kernel: Linux 6.14.0-10-generic #10-Ubuntu SMP PREEMPT_DYNAMIC Wed Mar 12 15:45:31 UTC 2025
245s autopkgtest [16:14:49]: @@@@@@@@@@@@@@@@@@@@ apt-source redis
252s Get:1 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (dsc) [2273 B]
252s Get:2 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (tar) [3026 kB]
252s Get:3 http://ftpmaster.internal/ubuntu plucky/universe redis 5:7.0.15-3 (diff) [31.7 kB]
252s gpgv: Signature made Tue Jan 21 10:13:21 2025 UTC
252s gpgv: using RSA key C2FE4BD271C139B86C533E461E953E27D4311E58
252s gpgv: Can't check signature: No public key
252s dpkg-source: warning: cannot verify inline signature for ./redis_7.0.15-3.dsc: no acceptable signature found
252s autopkgtest [16:14:56]: testing package redis version 5:7.0.15-3
253s autopkgtest [16:14:57]: build not needed
255s autopkgtest [16:14:59]: test 0001-redis-cli: preparing testbed
255s Reading package lists...
256s Building dependency tree...
256s Reading state information...
256s Starting pkgProblemResolver with broken count: 0
256s Starting 2 pkgProblemResolver with broken count: 0
256s Done
257s The following NEW packages will be installed:
257s   liblzf1 redis redis-sentinel redis-server redis-tools
257s 0 upgraded, 5 newly installed, 0 to remove and 0 not upgraded.
257s Need to get 1239 kB of archives.
257s After this operation, 7304 kB of additional disk space will be used.
257s Get:1 http://ftpmaster.internal/ubuntu plucky/universe arm64 liblzf1 arm64 3.6-4 [7426 B]
257s Get:2 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-tools arm64 5:7.0.15-3 [1165 kB]
259s Get:3 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-sentinel arm64 5:7.0.15-3 [12.2 kB]
259s Get:4 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis-server arm64 5:7.0.15-3 [51.7 kB]
259s Get:5 http://ftpmaster.internal/ubuntu plucky/universe arm64 redis all 5:7.0.15-3 [2914 B]
259s Fetched 1239 kB in 2s (691 kB/s)
259s Selecting previously unselected package liblzf1:arm64.
259s (Reading database ... 81647 files and directories currently installed.)
259s Preparing to unpack .../liblzf1_3.6-4_arm64.deb ...
259s Unpacking liblzf1:arm64 (3.6-4) ...
259s Selecting previously unselected package redis-tools.
259s Preparing to unpack .../redis-tools_5%3a7.0.15-3_arm64.deb ...
259s Unpacking redis-tools (5:7.0.15-3) ...
260s Selecting previously unselected package redis-sentinel.
260s Preparing to unpack .../redis-sentinel_5%3a7.0.15-3_arm64.deb ...
260s Unpacking redis-sentinel (5:7.0.15-3) ...
260s Selecting previously unselected package redis-server.
260s Preparing to unpack .../redis-server_5%3a7.0.15-3_arm64.deb ...
260s Unpacking redis-server (5:7.0.15-3) ...
260s Selecting previously unselected package redis.
260s Preparing to unpack .../redis_5%3a7.0.15-3_all.deb ...
260s Unpacking redis (5:7.0.15-3) ...
260s Setting up liblzf1:arm64 (3.6-4) ...
260s Setting up redis-tools (5:7.0.15-3) ...
260s Setting up redis-server (5:7.0.15-3) ...
260s Created symlink '/etc/systemd/system/redis.service' → '/usr/lib/systemd/system/redis-server.service'.
260s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-server.service' → '/usr/lib/systemd/system/redis-server.service'.
261s Setting up redis-sentinel (5:7.0.15-3) ...
261s Created symlink '/etc/systemd/system/sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
261s Created symlink '/etc/systemd/system/multi-user.target.wants/redis-sentinel.service' → '/usr/lib/systemd/system/redis-sentinel.service'.
261s Setting up redis (5:7.0.15-3) ...
261s Processing triggers for man-db (2.13.0-1) ...
262s Processing triggers for libc-bin (2.41-1ubuntu2) ...
264s autopkgtest [16:15:08]: test 0001-redis-cli: [-----------------------
269s # Server
269s redis_version:7.0.15
269s redis_git_sha1:00000000
269s redis_git_dirty:0
269s redis_build_id:1369a98afcafaf0
269s redis_mode:standalone
269s os:Linux 6.14.0-10-generic aarch64
269s arch_bits:64
269s monotonic_clock:POSIX clock_gettime
269s multiplexing_api:epoll
269s atomicvar_api:c11-builtin
269s gcc_version:14.2.0
269s process_id:1770
269s process_supervised:systemd
269s run_id:045cc67235eed0fee54d1e3a4c99c3bb235e7b6b
269s tcp_port:6379
269s server_time_usec:1742055313675709
269s uptime_in_seconds:4
269s uptime_in_days:0
269s hz:10
269s configured_hz:10
269s lru_clock:14002065
269s executable:/usr/bin/redis-server
269s config_file:/etc/redis/redis.conf
269s io_threads_active:0
269s
269s # Clients
269s connected_clients:3
269s cluster_connections:0
269s maxclients:10000
269s client_recent_max_input_buffer:20480
269s client_recent_max_output_buffer:0
269s blocked_clients:0
269s tracking_clients:0
269s clients_in_timeout_table:0
269s
269s # Memory
269s used_memory:1094208
269s used_memory_human:1.04M
269s used_memory_rss:13246464
269s used_memory_rss_human:12.63M
269s used_memory_peak:1094208
269s used_memory_peak_human:1.04M
269s used_memory_peak_perc:102.15%
269s used_memory_overhead:953664
269s used_memory_startup:908864
269s used_memory_dataset:140544
269s used_memory_dataset_perc:75.83%
269s allocator_allocated:4599520
269s allocator_active:9371648
269s allocator_resident:10551296
269s total_system_memory:4088066048
269s total_system_memory_human:3.81G
269s used_memory_lua:31744
269s used_memory_vm_eval:31744
269s used_memory_lua_human:31.00K
269s used_memory_scripts_eval:0
269s number_of_cached_scripts:0
269s number_of_functions:0
269s number_of_libraries:0
269s used_memory_vm_functions:32768
269s used_memory_vm_total:64512
269s used_memory_vm_total_human:63.00K
269s used_memory_functions:200
269s used_memory_scripts:200
269s used_memory_scripts_human:200B
269s maxmemory:0
269s maxmemory_human:0B
269s maxmemory_policy:noeviction
269s allocator_frag_ratio:2.04
269s allocator_frag_bytes:4772128
269s allocator_rss_ratio:1.13
269s allocator_rss_bytes:1179648
269s rss_overhead_ratio:1.26
269s rss_overhead_bytes:2695168
269s mem_fragmentation_ratio:12.57
269s mem_fragmentation_bytes:12192856
269s mem_not_counted_for_evict:0
269s mem_replication_backlog:0
269s mem_total_replication_buffers:0
269s mem_clients_slaves:0
269s mem_clients_normal:44600
269s mem_cluster_links:0
269s mem_aof_buffer:0
269s mem_allocator:jemalloc-5.3.0
269s active_defrag_running:0
269s lazyfree_pending_objects:0
269s lazyfreed_objects:0
269s
269s # Persistence
269s loading:0
269s async_loading:0
269s current_cow_peak:0
269s current_cow_size:0
269s current_cow_size_age:0
269s current_fork_perc:0.00
269s current_save_keys_processed:0
269s current_save_keys_total:0
269s rdb_changes_since_last_save:0
269s rdb_bgsave_in_progress:0
269s rdb_last_save_time:1742055309
269s rdb_last_bgsave_status:ok
269s rdb_last_bgsave_time_sec:-1
269s rdb_current_bgsave_time_sec:-1
269s rdb_saves:0
269s rdb_last_cow_size:0
269s rdb_last_load_keys_expired:0
269s rdb_last_load_keys_loaded:0
269s aof_enabled:0
269s aof_rewrite_in_progress:0
269s aof_rewrite_scheduled:0
269s aof_last_rewrite_time_sec:-1
269s aof_current_rewrite_time_sec:-1
269s aof_last_bgrewrite_status:ok
269s aof_rewrites:0
269s aof_rewrites_consecutive_failures:0
269s aof_last_write_status:ok
269s aof_last_cow_size:0
269s module_fork_in_progress:0
269s module_fork_last_cow_size:0
269s
269s # Stats
269s total_connections_received:3
269s total_commands_processed:8
269s instantaneous_ops_per_sec:0
269s total_net_input_bytes:483
269s total_net_output_bytes:353
269s total_net_repl_input_bytes:0
269s total_net_repl_output_bytes:0
269s instantaneous_input_kbps:0.00
269s instantaneous_output_kbps:0.00
269s instantaneous_input_repl_kbps:0.00
269s instantaneous_output_repl_kbps:0.00
269s rejected_connections:0
269s sync_full:0
269s sync_partial_ok:0
269s sync_partial_err:0
269s expired_keys:0
269s expired_stale_perc:0.00
269s expired_time_cap_reached_count:0
269s expire_cycle_cpu_milliseconds:0
269s evicted_keys:0
269s evicted_clients:0
269s total_eviction_exceeded_time:0
269s current_eviction_exceeded_time:0
269s keyspace_hits:0
269s keyspace_misses:0
269s pubsub_channels:1
269s pubsub_patterns:0
269s pubsubshard_channels:0
269s latest_fork_usec:0
269s total_forks:0
269s migrate_cached_sockets:0
269s slave_expires_tracked_keys:0
269s active_defrag_hits:0
269s active_defrag_misses:0
269s active_defrag_key_hits:0
269s active_defrag_key_misses:0
269s total_active_defrag_time:0
269s current_active_defrag_time:0
269s tracking_total_keys:0
269s tracking_total_items:0
269s tracking_total_prefixes:0
269s unexpected_error_replies:0
269s total_error_replies:0
269s dump_payload_sanitizations:0
269s total_reads_processed:7
269s total_writes_processed:8
269s io_threaded_reads_processed:0
269s io_threaded_writes_processed:0
269s reply_buffer_shrinks:2
269s reply_buffer_expands:0
269s
269s # Replication
269s role:master
269s connected_slaves:0
269s master_failover_state:no-failover
269s master_replid:f475da1de237cacc5c24fd365a5de8364b917af8
269s master_replid2:0000000000000000000000000000000000000000
269s master_repl_offset:0
269s second_repl_offset:-1
269s repl_backlog_active:0
269s repl_backlog_size:1048576
269s repl_backlog_first_byte_offset:0
269s repl_backlog_histlen:0
269s
269s # CPU
269s used_cpu_sys:0.024729
269s used_cpu_user:0.032941
269s used_cpu_sys_children:0.003161
269s used_cpu_user_children:0.000000
269s used_cpu_sys_main_thread:0.024667
269s used_cpu_user_main_thread:0.032859
269s
269s # Modules
269s
269s # Errorstats
269s
269s # Cluster
269s cluster_enabled:0
269s
269s # Keyspace
269s Redis ver. 7.0.15
270s autopkgtest [16:15:14]: test 0001-redis-cli: -----------------------]
270s autopkgtest [16:15:14]: test 0001-redis-cli: - - - - - - - - - - results - - - - - - - - - -
270s 0001-redis-cli PASS
270s autopkgtest [16:15:14]: test 0002-benchmark: preparing testbed
271s Reading package lists...
271s Building dependency tree...
271s Reading state information...
271s Starting pkgProblemResolver with broken count: 0
271s Starting 2 pkgProblemResolver with broken count: 0
271s Done
272s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
274s autopkgtest [16:15:18]: test 0002-benchmark: [-----------------------
280s PING_INLINE: rps=0.0 (overall: 0.0) avg_msec=nan (overall: nan)
280s ====== PING_INLINE ======
280s 100000 requests completed in 0.13 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.207 milliseconds (cumulative count 20)
280s 50.000% <= 0.519 milliseconds (cumulative count 51820)
280s 75.000% <= 0.615 milliseconds (cumulative count 75580)
280s 87.500% <= 0.695 milliseconds (cumulative count 87670)
280s 93.750% <= 0.751 milliseconds (cumulative count 94250)
280s 96.875% <= 0.799 milliseconds (cumulative count 97080)
280s 98.438% <= 0.863 milliseconds (cumulative count 98490)
280s 99.219% <= 0.983 milliseconds (cumulative count 99240)
280s 99.609% <= 1.071 milliseconds (cumulative count 99620)
280s 99.805% <= 1.199 milliseconds (cumulative count 99810)
280s 99.902% <= 1.319 milliseconds (cumulative count 99910)
280s 99.951% <= 1.431 milliseconds (cumulative count 99960)
280s 99.976% <= 1.463 milliseconds (cumulative count 99980)
280s 99.988% <= 1.479 milliseconds (cumulative count 99990)
280s 99.994% <= 1.487 milliseconds (cumulative count 100000)
280s 100.000% <= 1.487 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 0.020% <= 0.207 milliseconds (cumulative count 20)
280s 1.730% <= 0.303 milliseconds (cumulative count 1730)
280s 13.600% <= 0.407 milliseconds (cumulative count 13600)
280s 45.160% <= 0.503 milliseconds (cumulative count 45160)
280s 74.290% <= 0.607 milliseconds (cumulative count 74290)
280s 88.750% <= 0.703 milliseconds (cumulative count 88750)
280s 97.370% <= 0.807 milliseconds (cumulative count 97370)
280s 98.820% <= 0.903 milliseconds (cumulative count 98820)
280s 99.360% <= 1.007 milliseconds (cumulative count 99360)
280s 99.680% <= 1.103 milliseconds (cumulative count 99680)
280s 99.820% <= 1.207 milliseconds (cumulative count 99820)
280s 99.900% <= 1.303 milliseconds (cumulative count 99900)
280s 99.940% <= 1.407 milliseconds (cumulative count 99940)
280s 100.000% <= 1.503 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 787401.56 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.538 0.200 0.519 0.759 0.935 1.487
280s PING_MBULK: rps=377520.0 (overall: 773606.6) avg_msec=0.351 (overall: 0.351)
280s ====== PING_MBULK ======
280s 100000 requests completed in 0.13 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.111 milliseconds (cumulative count 10)
280s 50.000% <= 0.335 milliseconds (cumulative count 54580)
280s 75.000% <= 0.367 milliseconds (cumulative count 76810)
280s 87.500% <= 0.407 milliseconds (cumulative count 88110)
280s 93.750% <= 0.455 milliseconds (cumulative count 93840)
280s 96.875% <= 0.527 milliseconds (cumulative count 97050)
280s 98.438% <= 0.663 milliseconds (cumulative count 98480)
280s 99.219% <= 0.775 milliseconds (cumulative count 99230)
280s 99.609% <= 1.263 milliseconds (cumulative count 99630)
280s 99.805% <= 1.911 milliseconds (cumulative count 99810)
280s 99.902% <= 1.935 milliseconds (cumulative count 99950)
280s 99.951% <= 1.943 milliseconds (cumulative count 99960)
280s 99.976% <= 1.951 milliseconds (cumulative count 99990)
280s 99.994% <= 1.959 milliseconds (cumulative count 100000)
280s 100.000% <= 1.959 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 0.030% <= 0.207 milliseconds (cumulative count 30)
280s 21.680% <= 0.303 milliseconds (cumulative count 21680)
280s 88.110% <= 0.407 milliseconds (cumulative count 88110)
280s 96.400% <= 0.503 milliseconds (cumulative count 96400)
280s 97.930% <= 0.607 milliseconds (cumulative count 97930)
280s 98.870% <= 0.703 milliseconds (cumulative count 98870)
280s 99.330% <= 0.807 milliseconds (cumulative count 99330)
280s 99.430% <= 0.903 milliseconds (cumulative count 99430)
280s 99.440% <= 1.007 milliseconds (cumulative count 99440)
280s 99.490% <= 1.207 milliseconds (cumulative count 99490)
280s 99.710% <= 1.303 milliseconds (cumulative count 99710)
280s 99.730% <= 1.407 milliseconds (cumulative count 99730)
280s 99.780% <= 1.903 milliseconds (cumulative count 99780)
280s 100.000% <= 2.007 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 769230.81 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.354 0.104 0.335 0.479 0.727 1.959
280s ====== SET ======
280s 100000 requests completed in 0.14 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.231 milliseconds (cumulative count 10)
280s 50.000% <= 0.607 milliseconds (cumulative count 52590)
280s 75.000% <= 0.703 milliseconds (cumulative count 75160)
280s 87.500% <= 0.791 milliseconds (cumulative count 87890)
280s 93.750% <= 0.847 milliseconds (cumulative count 94430)
280s 96.875% <= 0.879 milliseconds (cumulative count 96920)
280s 98.438% <= 0.919 milliseconds (cumulative count 98620)
280s 99.219% <= 0.951 milliseconds (cumulative count 99310)
280s 99.609% <= 0.983 milliseconds (cumulative count 99630)
280s 99.805% <= 1.015 milliseconds (cumulative count 99830)
280s 99.902% <= 1.039 milliseconds (cumulative count 99920)
280s 99.951% <= 1.055 milliseconds (cumulative count 99960)
280s 99.976% <= 1.071 milliseconds (cumulative count 99980)
280s 99.988% <= 1.095 milliseconds (cumulative count 99990)
280s 99.994% <= 1.143 milliseconds (cumulative count 100000)
280s 100.000% <= 1.143 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 0.540% <= 0.303 milliseconds (cumulative count 540)
280s 5.960% <= 0.407 milliseconds (cumulative count 5960)
280s 19.390% <= 0.503 milliseconds (cumulative count 19390)
280s 52.590% <= 0.607 milliseconds (cumulative count 52590)
280s 75.160% <= 0.703 milliseconds (cumulative count 75160)
280s 89.800% <= 0.807 milliseconds (cumulative count 89800)
280s 98.130% <= 0.903 milliseconds (cumulative count 98130)
280s 99.800% <= 1.007 milliseconds (cumulative count 99800)
280s 99.990% <= 1.103 milliseconds (cumulative count 99990)
280s 100.000% <= 1.207 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 694444.50 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.614 0.224 0.607 0.855 0.935 1.143
280s GET: rps=279761.0 (overall: 739157.9) avg_msec=0.534 (overall: 0.534) ====== GET ======
280s 100000 requests completed in 0.14 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.215 milliseconds (cumulative count 20)
280s 50.000% <= 0.535 milliseconds (cumulative count 50950)
280s 75.000% <= 0.639 milliseconds (cumulative count 76340)
280s 87.500% <= 0.727 milliseconds (cumulative count 88250)
280s 93.750% <= 0.783 milliseconds (cumulative count 93900)
280s 96.875% <= 0.831 milliseconds (cumulative count 97140)
280s 98.438% <= 0.879 milliseconds (cumulative count 98530)
280s 99.219% <= 0.919 milliseconds (cumulative count 99230)
280s 99.609% <= 0.983 milliseconds (cumulative count 99610)
280s 99.805% <= 1.039 milliseconds (cumulative count 99810)
280s 99.902% <= 1.079 milliseconds (cumulative count 99910)
280s 99.951% <= 1.111 milliseconds (cumulative count 99960)
280s 99.976% <= 1.159 milliseconds (cumulative count 99980)
280s 99.988% <= 1.207 milliseconds (cumulative count 99990)
280s 99.994% <= 1.343 milliseconds (cumulative count 100000)
280s 100.000% <= 1.343 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 2.940% <= 0.303 milliseconds (cumulative count 2940)
280s 18.990% <= 0.407 milliseconds (cumulative count 18990)
280s 42.030% <= 0.503 milliseconds (cumulative count 42030)
280s 69.990% <= 0.607 milliseconds (cumulative count 69990)
280s 85.490% <= 0.703 milliseconds (cumulative count 85490)
280s 95.750% <= 0.807 milliseconds (cumulative count 95750)
280s 98.950% <= 0.903 milliseconds (cumulative count 98950)
280s 99.700% <= 1.007 milliseconds (cumulative count 99700)
280s 99.950% <= 1.103 milliseconds (cumulative count 99950)
280s 99.990% <= 1.207 milliseconds (cumulative count 99990)
280s 100.000% <= 1.407 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 735294.06 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.542 0.208 0.535 0.799 0.911 1.343
280s ====== INCR ======
280s 100000 requests completed in 0.14 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.231 milliseconds (cumulative count 10)
280s 50.000% <= 0.575 milliseconds (cumulative count 50320)
280s 75.000% <= 0.679 milliseconds (cumulative count 75590)
280s 87.500% <= 0.759 milliseconds (cumulative count 87600)
280s 93.750% <= 0.823 milliseconds (cumulative count 94430)
280s 96.875% <= 0.863 milliseconds (cumulative count 96880)
280s 98.438% <= 0.911 milliseconds (cumulative count 98600)
280s 99.219% <= 0.959 milliseconds (cumulative count 99230)
280s 99.609% <= 1.007 milliseconds (cumulative count 99620)
280s 99.805% <= 1.071 milliseconds (cumulative count 99820)
280s 99.902% <= 1.103 milliseconds (cumulative count 99940)
280s 99.951% <= 1.143 milliseconds (cumulative count 99960)
280s 99.976% <= 1.167 milliseconds (cumulative count 99980)
280s 99.988% <= 1.183 milliseconds (cumulative count 99990)
280s 99.994% <= 1.207 milliseconds (cumulative count 100000)
280s 100.000% <= 1.207 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 1.560% <= 0.303 milliseconds (cumulative count 1560)
280s 11.250% <= 0.407 milliseconds (cumulative count 11250)
280s 29.330% <= 0.503 milliseconds (cumulative count 29330)
280s 59.850% <= 0.607 milliseconds (cumulative count 59850)
280s 79.570% <= 0.703 milliseconds (cumulative count 79570)
280s 93.030% <= 0.807 milliseconds (cumulative count 93030)
280s 98.410% <= 0.903 milliseconds (cumulative count 98410)
280s 99.620% <= 1.007 milliseconds (cumulative count 99620)
280s 99.940% <= 1.103 milliseconds (cumulative count 99940)
280s 100.000% <= 1.207 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 719424.44 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.584 0.224 0.575 0.839 0.935 1.207
280s LPUSH: rps=157840.0 (overall: 597878.8) avg_msec=0.732 (overall: 0.732) ====== LPUSH ======
280s 100000 requests completed in 0.17 seconds
280s 50 parallel clients
280s 3 bytes payload
280s keep alive: 1
280s host configuration "save": 3600 1 300 100 60 10000
280s host configuration "appendonly": no
280s multi-thread: no
280s
280s Latency by percentile distribution:
280s 0.000% <= 0.271 milliseconds (cumulative count 10)
280s 50.000% <= 0.711 milliseconds (cumulative count 50160)
280s 75.000% <= 0.839 milliseconds (cumulative count 75210)
280s 87.500% <= 0.927 milliseconds (cumulative count 87640)
280s 93.750% <= 0.991 milliseconds (cumulative count 94530)
280s 96.875% <= 1.031 milliseconds (cumulative count 97140)
280s 98.438% <= 1.071 milliseconds (cumulative count 98460)
280s 99.219% <= 1.119 milliseconds (cumulative count 99240)
280s 99.609% <= 1.175 milliseconds (cumulative count 99660)
280s 99.805% <= 1.215 milliseconds (cumulative count 99820)
280s 99.902% <= 1.263 milliseconds (cumulative count 99910)
280s 99.951% <= 1.303 milliseconds (cumulative count 99960)
280s 99.976% <= 1.311 milliseconds (cumulative count 99980)
280s 99.988% <= 1.327 milliseconds (cumulative count 99990)
280s 99.994% <= 1.343 milliseconds (cumulative count 100000)
280s 100.000% <= 1.343 milliseconds (cumulative count 100000)
280s
280s Cumulative distribution of latencies:
280s 0.000% <= 0.103 milliseconds (cumulative count 0)
280s 0.050% <= 0.303 milliseconds (cumulative count 50)
280s 0.460% <= 0.407 milliseconds (cumulative count 460)
280s 2.040% <= 0.503 milliseconds (cumulative count 2040)
280s 18.380% <= 0.607 milliseconds (cumulative count 18380)
280s 48.010% <= 0.703 milliseconds (cumulative count 48010)
280s 69.940% <= 0.807 milliseconds (cumulative count 69940)
280s 84.370% <= 0.903 milliseconds (cumulative count 84370)
280s 95.760% <= 1.007 milliseconds (cumulative count 95760)
280s 99.000% <= 1.103 milliseconds (cumulative count 99000)
280s 99.770% <= 1.207 milliseconds (cumulative count 99770)
280s 99.960% <= 1.303 milliseconds (cumulative count 99960)
280s 100.000% <= 1.407 milliseconds (cumulative count 100000)
280s
280s Summary:
280s throughput summary: 598802.44 requests per second
280s latency summary (msec):
280s avg min p50 p95 p99 max
280s 0.738 0.264 0.711 0.999 1.103 1.343
281s RPUSH: rps=387440.0 (overall: 658911.6) avg_msec=0.665 (overall: 0.665) ====== RPUSH ======
281s 100000 requests completed in 0.15 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.279 milliseconds (cumulative count 10)
281s 50.000% <= 0.639 milliseconds (cumulative count 50890)
281s 75.000% <= 0.759 milliseconds (cumulative count 75340)
281s 87.500% <= 0.847 milliseconds (cumulative count 88280)
281s 93.750% <= 0.895 milliseconds (cumulative count 94030)
281s 96.875% <= 0.935 milliseconds (cumulative count 96950)
281s 98.438% <= 0.983 milliseconds (cumulative count 98640)
281s 99.219% <= 1.023 milliseconds (cumulative count 99250)
281s 99.609% <= 1.087 milliseconds (cumulative count 99620)
281s 99.805% <= 1.135 milliseconds (cumulative count 99810)
281s 99.902% <= 1.215 milliseconds (cumulative count 99910)
281s 99.951% <= 1.263 milliseconds (cumulative count 99970)
281s 99.976% <= 1.271 milliseconds (cumulative count 99980)
281s 99.988% <= 1.295 milliseconds (cumulative count 99990)
281s 99.994% <= 1.311 milliseconds (cumulative count 100000)
281s 100.000% <= 1.311 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.120% <= 0.303 milliseconds (cumulative count 120)
281s 1.250% <= 0.407 milliseconds (cumulative count 1250)
281s 7.570% <= 0.503 milliseconds (cumulative count 7570)
281s 39.670% <= 0.607 milliseconds (cumulative count 39670)
281s 65.990% <= 0.703 milliseconds (cumulative count 65990)
281s 82.860% <= 0.807 milliseconds (cumulative count 82860)
281s 94.730% <= 0.903 milliseconds (cumulative count 94730)
281s 99.050% <= 1.007 milliseconds (cumulative count 99050)
281s 99.720% <= 1.103 milliseconds (cumulative count 99720)
281s 99.900% <= 1.207 milliseconds (cumulative count 99900)
281s 99.990% <= 1.303 milliseconds (cumulative count 99990)
281s 100.000% <= 1.407 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 657894.75 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.666 0.272 0.639 0.911 1.007 1.311
281s ====== LPOP ======
281s 100000 requests completed in 0.17 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.247 milliseconds (cumulative count 20)
281s 50.000% <= 0.767 milliseconds (cumulative count 51300)
281s 75.000% <= 0.895 milliseconds (cumulative count 75720)
281s 87.500% <= 0.983 milliseconds (cumulative count 88230)
281s 93.750% <= 1.031 milliseconds (cumulative count 93960)
281s 96.875% <= 1.071 milliseconds (cumulative count 96890)
281s 98.438% <= 1.111 milliseconds (cumulative count 98440)
281s 99.219% <= 1.151 milliseconds (cumulative count 99250)
281s 99.609% <= 1.199 milliseconds (cumulative count 99650)
281s 99.805% <= 1.255 milliseconds (cumulative count 99830)
281s 99.902% <= 1.319 milliseconds (cumulative count 99910)
281s 99.951% <= 1.375 milliseconds (cumulative count 99960)
281s 99.976% <= 1.423 milliseconds (cumulative count 99980)
281s 99.988% <= 1.447 milliseconds (cumulative count 99990)
281s 99.994% <= 1.463 milliseconds (cumulative count 100000)
281s 100.000% <= 1.463 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.090% <= 0.303 milliseconds (cumulative count 90)
281s 0.460% <= 0.407 milliseconds (cumulative count 460)
281s 1.130% <= 0.503 milliseconds (cumulative count 1130)
281s 8.590% <= 0.607 milliseconds (cumulative count 8590)
281s 35.060% <= 0.703 milliseconds (cumulative count 35060)
281s 60.160% <= 0.807 milliseconds (cumulative count 60160)
281s 76.960% <= 0.903 milliseconds (cumulative count 76960)
281s 91.390% <= 1.007 milliseconds (cumulative count 91390)
281s 98.210% <= 1.103 milliseconds (cumulative count 98210)
281s 99.660% <= 1.207 milliseconds (cumulative count 99660)
281s 99.890% <= 1.303 milliseconds (cumulative count 99890)
281s 99.970% <= 1.407 milliseconds (cumulative count 99970)
281s 100.000% <= 1.503 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 571428.56 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.785 0.240 0.767 1.047 1.143 1.463
281s RPOP: rps=157768.9 (overall: 591044.8) avg_msec=0.738 (overall: 0.738) ====== RPOP ======
281s 100000 requests completed in 0.17 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.239 milliseconds (cumulative count 10)
281s 50.000% <= 0.719 milliseconds (cumulative count 50530)
281s 75.000% <= 0.855 milliseconds (cumulative count 76070)
281s 87.500% <= 0.943 milliseconds (cumulative count 88460)
281s 93.750% <= 0.991 milliseconds (cumulative count 94060)
281s 96.875% <= 1.039 milliseconds (cumulative count 97110)
281s 98.438% <= 1.095 milliseconds (cumulative count 98630)
281s 99.219% <= 1.143 milliseconds (cumulative count 99250)
281s 99.609% <= 1.199 milliseconds (cumulative count 99620)
281s 99.805% <= 1.263 milliseconds (cumulative count 99820)
281s 99.902% <= 1.319 milliseconds (cumulative count 99920)
281s 99.951% <= 1.351 milliseconds (cumulative count 99960)
281s 99.976% <= 1.367 milliseconds (cumulative count 99980)
281s 99.988% <= 1.383 milliseconds (cumulative count 99990)
281s 99.994% <= 1.399 milliseconds (cumulative count 100000)
281s 100.000% <= 1.399 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.120% <= 0.303 milliseconds (cumulative count 120)
281s 0.570% <= 0.407 milliseconds (cumulative count 570)
281s 1.770% <= 0.503 milliseconds (cumulative count 1770)
281s 16.380% <= 0.607 milliseconds (cumulative count 16380)
281s 46.570% <= 0.703 milliseconds (cumulative count 46570)
281s 68.500% <= 0.807 milliseconds (cumulative count 68500)
281s 83.120% <= 0.903 milliseconds (cumulative count 83120)
281s 95.290% <= 1.007 milliseconds (cumulative count 95290)
281s 98.740% <= 1.103 milliseconds (cumulative count 98740)
281s 99.670% <= 1.207 milliseconds (cumulative count 99670)
281s 99.890% <= 1.303 milliseconds (cumulative count 99890)
281s 100.000% <= 1.407 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 595238.12 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.746 0.232 0.719 1.007 1.119 1.399
281s ====== SADD ======
281s 100000 requests completed in 0.14 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.239 milliseconds (cumulative count 40)
281s 50.000% <= 0.567 milliseconds (cumulative count 52180)
281s 75.000% <= 0.663 milliseconds (cumulative count 76000)
281s 87.500% <= 0.743 milliseconds (cumulative count 87790)
281s 93.750% <= 0.791 milliseconds (cumulative count 93960)
281s 96.875% <= 0.831 milliseconds (cumulative count 97150)
281s 98.438% <= 0.871 milliseconds (cumulative count 98480)
281s 99.219% <= 0.911 milliseconds (cumulative count 99240)
281s 99.609% <= 0.983 milliseconds (cumulative count 99620)
281s 99.805% <= 1.047 milliseconds (cumulative count 99810)
281s 99.902% <= 1.103 milliseconds (cumulative count 99910)
281s 99.951% <= 1.215 milliseconds (cumulative count 99960)
281s 99.976% <= 1.255 milliseconds (cumulative count 99980)
281s 99.988% <= 1.271 milliseconds (cumulative count 99990)
281s 99.994% <= 1.287 milliseconds (cumulative count 100000)
281s 100.000% <= 1.287 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 1.260% <= 0.303 milliseconds (cumulative count 1260)
281s 6.160% <= 0.407 milliseconds (cumulative count 6160)
281s 25.860% <= 0.503 milliseconds (cumulative count 25860)
281s 64.890% <= 0.607 milliseconds (cumulative count 64890)
281s 82.140% <= 0.703 milliseconds (cumulative count 82140)
281s 95.500% <= 0.807 milliseconds (cumulative count 95500)
281s 99.100% <= 0.903 milliseconds (cumulative count 99100)
281s 99.710% <= 1.007 milliseconds (cumulative count 99710)
281s 99.910% <= 1.103 milliseconds (cumulative count 99910)
281s 99.950% <= 1.207 milliseconds (cumulative count 99950)
281s 100.000% <= 1.303 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 740740.69 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.581 0.232 0.567 0.807 0.903 1.287
281s HSET: rps=22000.0 (overall: 550000.0) avg_msec=0.706 (overall: 0.706) ====== HSET ======
281s 100000 requests completed in 0.16 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.255 milliseconds (cumulative count 20)
281s 50.000% <= 0.671 milliseconds (cumulative count 51460)
281s 75.000% <= 0.791 milliseconds (cumulative count 75230)
281s 87.500% <= 0.879 milliseconds (cumulative count 87630)
281s 93.750% <= 0.927 milliseconds (cumulative count 94040)
281s 96.875% <= 0.967 milliseconds (cumulative count 97240)
281s 98.438% <= 1.007 milliseconds (cumulative count 98550)
281s 99.219% <= 1.047 milliseconds (cumulative count 99310)
281s 99.609% <= 1.079 milliseconds (cumulative count 99630)
281s 99.805% <= 1.119 milliseconds (cumulative count 99810)
281s 99.902% <= 1.151 milliseconds (cumulative count 99910)
281s 99.951% <= 1.199 milliseconds (cumulative count 99960)
281s 99.976% <= 1.247 milliseconds (cumulative count 99980)
281s 99.988% <= 1.263 milliseconds (cumulative count 99990)
281s 99.994% <= 1.279 milliseconds (cumulative count 100000)
281s 100.000% <= 1.279 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.110% <= 0.303 milliseconds (cumulative count 110)
281s 1.340% <= 0.407 milliseconds (cumulative count 1340)
281s 4.360% <= 0.503 milliseconds (cumulative count 4360)
281s 29.920% <= 0.607 milliseconds (cumulative count 29920)
281s 58.730% <= 0.703 milliseconds (cumulative count 58730)
281s 77.680% <= 0.807 milliseconds (cumulative count 77680)
281s 90.990% <= 0.903 milliseconds (cumulative count 90990)
281s 98.550% <= 1.007 milliseconds (cumulative count 98550)
281s 99.750% <= 1.103 milliseconds (cumulative count 99750)
281s 99.960% <= 1.207 milliseconds (cumulative count 99960)
281s 100.000% <= 1.303 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 632911.38 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.695 0.248 0.671 0.943 1.031 1.279
281s SPOP: rps=308720.0 (overall: 771800.0) avg_msec=0.422 (overall: 0.422) ====== SPOP ======
281s 100000 requests completed in 0.13 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.143 milliseconds (cumulative count 10)
281s 50.000% <= 0.399 milliseconds (cumulative count 52290)
281s 75.000% <= 0.463 milliseconds (cumulative count 75350)
281s 87.500% <= 0.527 milliseconds (cumulative count 88480)
281s 93.750% <= 0.583 milliseconds (cumulative count 93800)
281s 96.875% <= 0.655 milliseconds (cumulative count 97080)
281s 98.438% <= 0.735 milliseconds (cumulative count 98510)
281s 99.219% <= 0.967 milliseconds (cumulative count 99230)
281s 99.609% <= 1.191 milliseconds (cumulative count 99610)
281s 99.805% <= 1.327 milliseconds (cumulative count 99810)
281s 99.902% <= 1.471 milliseconds (cumulative count 99910)
281s 99.951% <= 1.791 milliseconds (cumulative count 99960)
281s 99.976% <= 1.847 milliseconds (cumulative count 99980)
281s 99.988% <= 1.903 milliseconds (cumulative count 99990)
281s 99.994% <= 1.991 milliseconds (cumulative count 100000)
281s 100.000% <= 1.991 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.080% <= 0.207 milliseconds (cumulative count 80)
281s 10.500% <= 0.303 milliseconds (cumulative count 10500)
281s 55.510% <= 0.407 milliseconds (cumulative count 55510)
281s 84.750% <= 0.503 milliseconds (cumulative count 84750)
281s 95.150% <= 0.607 milliseconds (cumulative count 95150)
281s 98.060% <= 0.703 milliseconds (cumulative count 98060)
281s 98.880% <= 0.807 milliseconds (cumulative count 98880)
281s 99.080% <= 0.903 milliseconds (cumulative count 99080)
281s 99.300% <= 1.007 milliseconds (cumulative count 99300)
281s 99.460% <= 1.103 milliseconds (cumulative count 99460)
281s 99.650% <= 1.207 milliseconds (cumulative count 99650)
281s 99.790% <= 1.303 milliseconds (cumulative count 99790)
281s 99.870% <= 1.407 milliseconds (cumulative count 99870)
281s 99.910% <= 1.503 milliseconds (cumulative count 99910)
281s 99.920% <= 1.703 milliseconds (cumulative count 99920)
281s 99.960% <= 1.807 milliseconds (cumulative count 99960)
281s 99.990% <= 1.903 milliseconds (cumulative count 99990)
281s 100.000% <= 2.007 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 775193.81 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.416 0.136 0.399 0.607 0.863 1.991
281s ====== ZADD ======
281s 100000 requests completed in 0.17 seconds
281s 50 parallel clients
281s 3 bytes payload
281s keep alive: 1
281s host configuration "save": 3600 1 300 100 60 10000
281s host configuration "appendonly": no
281s multi-thread: no
281s
281s Latency by percentile distribution:
281s 0.000% <= 0.263 milliseconds (cumulative count 10)
281s 50.000% <= 0.735 milliseconds (cumulative count 50040)
281s 75.000% <= 0.863 milliseconds (cumulative count 75030)
281s 87.500% <= 0.959 milliseconds (cumulative count 88480)
281s 93.750% <= 1.007 milliseconds (cumulative count 94380)
281s 96.875% <= 1.039 milliseconds (cumulative count 96950)
281s 98.438% <= 1.079 milliseconds (cumulative count 98470)
281s 99.219% <= 1.127 milliseconds (cumulative count 99340)
281s 99.609% <= 1.159 milliseconds (cumulative count 99650)
281s 99.805% <= 1.191 milliseconds (cumulative count 99810)
281s 99.902% <= 1.247 milliseconds (cumulative count 99910)
281s 99.951% <= 1.319 milliseconds (cumulative count 99960)
281s 99.976% <= 1.335 milliseconds (cumulative count 99980)
281s 99.988% <= 1.343 milliseconds (cumulative count 99990)
281s 99.994% <= 1.367 milliseconds (cumulative count 100000)
281s 100.000% <= 1.367 milliseconds (cumulative count 100000)
281s
281s Cumulative distribution of latencies:
281s 0.000% <= 0.103 milliseconds (cumulative count 0)
281s 0.030% <= 0.303 milliseconds (cumulative count 30)
281s 0.410% <= 0.407 milliseconds (cumulative count 410)
281s 1.420% <= 0.503 milliseconds (cumulative count 1420)
281s 9.550% <= 0.607 milliseconds (cumulative count 9550)
281s 41.360% <= 0.703 milliseconds (cumulative count 41360)
281s 65.410% <= 0.807 milliseconds (cumulative count 65410)
281s 80.780% <= 0.903 milliseconds (cumulative count 80780)
281s 94.380% <= 1.007 milliseconds (cumulative count 94380)
281s 98.880% <= 1.103 milliseconds (cumulative count 98880)
281s 99.830% <= 1.207 milliseconds (cumulative count 99830)
281s 99.940% <= 1.303 milliseconds (cumulative count 99940)
281s 100.000% <= 1.407 milliseconds (cumulative count 100000)
281s
281s Summary:
281s throughput summary: 581395.31 requests per second
281s latency summary (msec):
281s avg min p50 p95 p99 max
281s 0.764 0.256 0.735 1.015 1.111 1.367
282s ZPOPMIN: rps=138008.0 (overall: 753043.5) avg_msec=0.431 (overall: 0.431) ====== ZPOPMIN ======
282s 100000 requests completed in 0.13 seconds
282s 50 parallel clients
282s 3 bytes payload
282s keep alive: 1
282s host configuration "save": 3600 1 300 100 60 10000
282s host configuration "appendonly": no
282s multi-thread: no
282s
282s Latency by percentile distribution:
282s 0.000% <= 0.143 milliseconds (cumulative count 10)
282s 50.000% <= 0.391 milliseconds (cumulative count 50080)
282s 75.000% <= 0.463 milliseconds (cumulative count 76560)
282s 87.500% <= 0.519 milliseconds (cumulative count 88570)
282s 93.750% <= 0.567 milliseconds (cumulative count 94050)
282s 96.875% <= 0.615 milliseconds (cumulative count 97060)
282s 98.438% <= 0.663 milliseconds (cumulative count 98620)
282s 99.219% <= 0.695 milliseconds (cumulative count 99280)
282s 99.609% <= 0.719 milliseconds (cumulative count 99610)
282s 99.805% <= 0.743 milliseconds (cumulative count 99820)
282s 99.902% <= 0.759 milliseconds (cumulative count 99910)
282s 99.951% <= 0.775 milliseconds (cumulative count 99960)
282s 99.976% <= 0.807 milliseconds (cumulative count 99980)
282s 99.988% <= 0.815 milliseconds (cumulative count 100000)
282s 100.000% <= 0.815 milliseconds (cumulative count 100000)
282s
282s Cumulative distribution of latencies:
282s 0.000% <= 0.103 milliseconds (cumulative count 0)
282s 0.050% <= 0.207 milliseconds (cumulative count 50)
282s 9.610% <= 0.303 milliseconds (cumulative count 9610)
282s 56.930% <= 0.407 milliseconds (cumulative count 56930)
282s 85.930% <= 0.503 milliseconds (cumulative count 85930)
282s 96.710% <= 0.607 milliseconds (cumulative count 96710)
282s 99.410% <= 0.703 milliseconds (cumulative count 99410)
282s 99.980% <= 0.807 milliseconds (cumulative count 99980)
282s 100.000% <= 0.903 milliseconds (cumulative count 100000)
282s
282s Summary:
282s throughput summary: 787401.56 requests per second
282s latency summary (msec):
282s avg min p50 p95 p99 max
282s 0.407 0.136 0.391 0.583 0.687 0.815
282s LPUSH (needed to benchmark LRANGE): rps=392520.0 (overall: 587604.8) avg_msec=0.750 (overall: 0.750) ====== LPUSH (needed to benchmark LRANGE) ======
282s 100000 requests completed in 0.17 seconds
282s 50 parallel clients
282s 3 bytes payload
282s keep alive: 1
282s host configuration "save": 3600 1 300 100 60 10000
282s host configuration "appendonly": no
282s multi-thread: no
282s
282s Latency by percentile distribution:
282s 0.000% <= 0.263 milliseconds (cumulative count 20)
282s 50.000% <= 0.727 milliseconds (cumulative count 51810)
282s 75.000% <= 0.855 milliseconds (cumulative count 75390)
282s 87.500% <= 0.943 milliseconds (cumulative count 87520)
282s 93.750% <= 1.007 milliseconds (cumulative count 94520)
282s 96.875% <= 1.047 milliseconds (cumulative count 96990)
282s 98.438% <= 1.087 milliseconds (cumulative count 98450)
282s 99.219% <= 1.135 milliseconds (cumulative count 99320)
282s 99.609% <= 1.183 milliseconds (cumulative count 99660)
282s 99.805% <= 1.215 milliseconds (cumulative count 99810)
282s 99.902% <= 1.255 milliseconds (cumulative count 99910)
282s 99.951% <= 1.319 milliseconds (cumulative count 99960)
282s 99.976% <= 1.359 milliseconds (cumulative count 99980)
282s 99.988% <= 1.391 milliseconds (cumulative count 99990)
282s 99.994% <= 1.431 milliseconds (cumulative count 100000)
282s 100.000% <= 1.431 milliseconds (cumulative count 100000)
282s
282s Cumulative distribution of latencies:
282s 0.000% <= 0.103 milliseconds (cumulative count 0)
282s 0.070% <= 0.303 milliseconds (cumulative count 70)
282s 0.610% <= 0.407 milliseconds (cumulative count 610)
282s 2.050% <= 0.503 milliseconds (cumulative count 2050)
282s 14.750% <= 0.607 milliseconds (cumulative count 14750)
282s 45.260% <= 0.703 milliseconds (cumulative count 45260)
282s 67.700% <= 0.807 milliseconds (cumulative count 67700)
282s 82.170% <= 0.903 milliseconds (cumulative count 82170)
282s 94.520% <= 1.007 milliseconds (cumulative count 94520)
282s 98.830% <= 1.103 milliseconds (cumulative count 98830)
282s 99.800% <= 1.207 milliseconds (cumulative count 99800)
282s 99.950% <= 1.303 milliseconds (cumulative count 99950)
282s 99.990% <= 1.407 milliseconds (cumulative count 99990)
282s 100.000% <= 1.503 milliseconds (cumulative count 100000)
282s
282s Summary:
282s throughput summary: 588235.31 requests per second
282s latency summary (msec):
282s avg min p50 p95 p99 max
282s 0.750 0.256 0.727 1.015 1.119 1.431
283s LRANGE_100 (first 100 elements): rps=109761.0 (overall: 111991.9) avg_msec=3.519 (overall: 3.519) LRANGE_100 (first 100 elements): rps=113960.0 (overall: 112983.9) avg_msec=3.510 (overall: 3.514) LRANGE_100 (first 100 elements): rps=96160.0 (overall: 107345.8) avg_msec=3.864 (overall: 3.619) ====== LRANGE_100 (first 100 elements) ======
283s 100000 requests completed in 0.92 seconds
283s 50 parallel clients
283s 3 bytes payload
283s keep alive: 1
283s host configuration "save": 3600 1 300 100 60 10000
283s host configuration "appendonly": no
283s multi-thread: no
283s
283s Latency by percentile distribution:
283s 0.000% <= 0.463 milliseconds (cumulative count 10)
283s 50.000% <= 3.415 milliseconds (cumulative count 50280)
283s 75.000% <= 3.975 milliseconds (cumulative count 75170)
283s 87.500% <= 4.487 milliseconds (cumulative count 87520)
283s 93.750% <= 5.311 milliseconds (cumulative count 93820)
283s 96.875% <= 5.871 milliseconds (cumulative count 96880)
283s 98.438% <= 6.623 milliseconds (cumulative count 98450)
283s 99.219% <= 7.551 milliseconds (cumulative count 99220)
283s 99.609% <= 14.799 milliseconds (cumulative count 99610)
283s 99.805% <= 15.559 milliseconds (cumulative count 99810)
283s 99.902% <= 16.167 milliseconds (cumulative count 99910)
283s 99.951% <= 17.503 milliseconds (cumulative count 99960)
283s 99.976% <= 18.703 milliseconds (cumulative count 99980)
283s 99.988% <= 19.215 milliseconds (cumulative count 99990)
283s 99.994% <= 19.775 milliseconds (cumulative count 100000)
283s 100.000% <= 19.775 milliseconds (cumulative count 100000)
283s
283s Cumulative distribution of latencies:
283s 0.000% <= 0.103 milliseconds (cumulative count 0)
283s 0.010% <= 0.503 milliseconds (cumulative count 10)
283s 0.050% <= 0.607 milliseconds (cumulative count 50)
283s 0.060% <= 0.703 milliseconds (cumulative count 60)
283s 0.080% <= 0.807 milliseconds (cumulative count 80)
283s 0.090% <= 0.903 milliseconds (cumulative count 90)
283s 0.110% <= 1.103 milliseconds (cumulative count 110)
283s 0.140% <= 1.207 milliseconds (cumulative count 140)
283s 0.180% <= 1.303 milliseconds (cumulative count 180)
283s 0.230% <= 1.407 milliseconds (cumulative count 230)
283s 0.300% <= 1.503 milliseconds (cumulative count 300)
283s 0.440% <= 1.607 milliseconds (cumulative count 440)
283s 0.700% <= 1.703 milliseconds (cumulative count 700)
283s 1.110% <= 1.807 milliseconds (cumulative count 1110)
283s 1.720% <= 1.903 milliseconds (cumulative count 1720)
283s 2.580% <= 2.007 milliseconds (cumulative count 2580)
283s 3.310% <= 2.103 milliseconds (cumulative count 3310)
283s 35.630% <= 3.103 milliseconds (cumulative count 35630)
283s 80.500% <= 4.103 milliseconds (cumulative count 80500)
283s 92.430% <= 5.103 milliseconds (cumulative count 92430)
283s 97.460% <= 6.103 milliseconds (cumulative count 97460)
283s 98.960% <= 7.103 milliseconds (cumulative count 98960)
283s 99.390% <= 8.103 milliseconds (cumulative count 99390)
283s 99.470% <= 9.103 milliseconds (cumulative count 99470)
283s 99.480% <= 10.103 milliseconds (cumulative count 99480)
283s 99.520% <= 14.103 milliseconds (cumulative count 99520)
283s 99.690% <= 15.103 milliseconds (cumulative count 99690)
283s 99.900% <= 16.103 milliseconds (cumulative count 99900)
283s 99.950% <= 17.103 milliseconds (cumulative count 99950)
283s 99.960% <= 18.111 milliseconds (cumulative count 99960)
283s 99.980% <= 19.103 milliseconds (cumulative count 99980)
283s 100.000% <= 20.111 milliseconds (cumulative count 100000)
283s
283s Summary:
283s throughput summary: 108225.10 requests per second
283s latency summary (msec):
283s avg min p50 p95 p99 max
283s 3.586 0.456 3.415 5.447 7.159 19.775
286s LRANGE_300 (first 300 elements): rps=6921.9 (overall: 23315.8) avg_msec=13.516 (overall: 13.516) LRANGE_300 (first 300 elements): rps=33454.5 (overall: 31112.5) avg_msec=7.636 (overall: 8.654) LRANGE_300 (first 300 elements): rps=33031.5 (overall: 31948.5) avg_msec=8.027 (overall: 8.371) LRANGE_300 (first 300 elements): rps=33627.0 (overall: 32455.1) avg_msec=7.410 (overall: 8.071) LRANGE_300 (first 300 elements): rps=33585.7 (overall: 32716.4) avg_msec=7.583 (overall: 7.955) LRANGE_300 (first 300 elements): rps=33496.0 (overall: 32862.3) avg_msec=7.816 (overall: 7.929) LRANGE_300 (first 300 elements): rps=33450.2 (overall: 32955.3) avg_msec=7.868 (overall: 7.919) LRANGE_300 (first 300 elements): rps=32972.3 (overall: 32957.6) avg_msec=8.176 (overall: 7.954) LRANGE_300 (first 300 elements): rps=32740.2 (overall: 32931.2) avg_msec=8.275 (overall: 7.993) LRANGE_300 (first 300 elements): rps=33503.9 (overall: 32993.2) avg_msec=7.351 (overall: 7.922) LRANGE_300 (first 300 elements): rps=33571.4 (overall: 33049.2) avg_msec=7.509 (overall: 7.882) LRANGE_300 (first 300 elements): rps=33402.4 (overall: 33080.3) avg_msec=7.803 (overall: 7.875) ====== LRANGE_300 (first 300 elements) ======
286s 100000 requests completed in 3.02 seconds
286s 50 parallel clients
286s 3 bytes payload
286s keep alive: 1
286s host configuration "save": 3600 1 300 100 60 10000
286s host configuration "appendonly": no
286s multi-thread: no
286s
286s Latency by percentile distribution:
286s 0.000% <= 1.023 milliseconds (cumulative count 10)
286s 50.000% <= 7.503 milliseconds (cumulative count 50120)
286s 75.000% <= 8.775 milliseconds (cumulative count 75100)
286s 87.500% <= 10.055 milliseconds (cumulative count 87520)
286s 93.750% <= 11.367 milliseconds (cumulative count 93760)
286s 96.875% <= 12.679 milliseconds (cumulative count 96890)
286s 98.438% <= 15.127 milliseconds (cumulative count 98440)
286s 99.219% <= 17.183 milliseconds (cumulative count 99220)
286s 99.609% <= 19.119 milliseconds (cumulative count 99610)
286s 99.805% <= 20.319 milliseconds (cumulative count 99810)
286s 99.902% <= 21.199 milliseconds (cumulative count 99910)
286s 99.951% <= 21.711 milliseconds (cumulative count 99970)
286s 99.976% <= 21.903 milliseconds (cumulative count 99980)
286s 99.988% <= 21.919 milliseconds (cumulative count 99990)
286s 99.994% <= 22.095 milliseconds (cumulative count 100000)
286s 100.000% <= 22.095 milliseconds (cumulative count 100000)
286s
286s Cumulative distribution of latencies:
286s 0.000% <= 0.103 milliseconds (cumulative count 0)
286s 0.020% <= 1.103 milliseconds (cumulative count 20)
286s 0.030% <= 1.207 milliseconds (cumulative count 30)
286s 0.040% <= 1.303 milliseconds (cumulative count 40)
286s 0.060% <= 1.407 milliseconds (cumulative count 60)
286s 0.080% <= 1.607 milliseconds (cumulative count 80)
286s 0.090% <= 1.903 milliseconds (cumulative count 90)
286s 0.100% <= 2.103 milliseconds (cumulative count 100)
286s 0.220% <= 3.103 milliseconds (cumulative count 220)
286s 1.270% <= 4.103 milliseconds (cumulative count 1270)
286s 5.310% <= 5.103 milliseconds (cumulative count 5310)
286s 17.440% <= 6.103 milliseconds (cumulative count 17440)
286s 40.270% <= 7.103 milliseconds (cumulative count 40270)
286s 63.550% <= 8.103 milliseconds (cumulative count 63550)
286s 79.490% <= 9.103 milliseconds (cumulative count 79490)
286s 87.740% <= 10.103 milliseconds (cumulative count 87740)
286s 92.700% <= 11.103 milliseconds (cumulative count 92700)
286s 95.940% <= 12.103 milliseconds (cumulative count 95940)
286s 97.360% <= 13.103 milliseconds (cumulative count 97360)
286s 98.080% <= 14.103 milliseconds (cumulative count 98080)
286s 98.430% <= 15.103 milliseconds (cumulative count 98430)
286s 98.780% <= 16.103 milliseconds (cumulative count 98780)
286s 99.180% <= 17.103 milliseconds (cumulative count 99180)
286s 99.440% <= 18.111 milliseconds (cumulative count 99440)
286s 99.590% <= 19.103 milliseconds (cumulative count 99590)
286s 99.770% <= 20.111 milliseconds (cumulative count 99770)
286s 99.900% <= 21.103 milliseconds (cumulative count 99900)
286s 100.000% <= 22.111 milliseconds (cumulative count 100000)
286s
286s Summary:
286s throughput summary: 33112.58 requests per second
286s latency summary (msec):
286s avg min p50 p95 p99 max
286s 7.857 1.016 7.503 11.759 16.687 22.095
292s LRANGE_500 (first 500 elements): rps=3824.9 (overall: 11564.7) avg_msec=23.270 (overall: 23.270) LRANGE_500 (first 500 elements): rps=13976.2 (overall: 13368.0) avg_msec=19.872 (overall: 20.614) LRANGE_500 (first 500 elements): rps=17464.6 (overall: 15128.6) avg_msec=14.427 (overall: 17.544) LRANGE_500 (first 500 elements): rps=15572.5 (overall: 15262.4) avg_msec=17.036 (overall: 17.388) LRANGE_500 (first 500 elements): rps=16856.6 (overall: 15627.2) avg_msec=16.387 (overall: 17.141) LRANGE_500 (first 500 elements): rps=18111.6 (overall: 16089.8) avg_msec=14.786 (overall: 16.647) LRANGE_500 (first 500 elements): rps=18166.7 (overall: 16416.9) avg_msec=13.832 (overall: 16.157) LRANGE_500 (first 500 elements): rps=19738.3 (overall: 16875.0) avg_msec=10.649 (overall: 15.268) LRANGE_500 (first 500 elements): rps=19738.1 (overall: 17217.3) avg_msec=10.321 (overall: 14.590) LRANGE_500 (first 500 elements): rps=19051.6 (overall: 17413.1) avg_msec=11.953 (overall: 14.282) LRANGE_500 (first 500 elements): rps=16781.7 (overall: 17352.2) avg_msec=16.319 (overall: 14.472) LRANGE_500 (first 500 elements): rps=15865.1 (overall: 17221.4) avg_msec=17.619 (overall: 14.727) LRANGE_500 (first 500 elements): rps=16797.6 (overall: 17187.1) avg_msec=16.648 (overall: 14.879) LRANGE_500 (first 500 elements): rps=15797.6 (overall: 17083.1) avg_msec=17.717 (overall: 15.075) LRANGE_500 (first 500 elements): rps=15174.4 (overall: 16947.3) avg_msec=18.704 (overall: 15.307) LRANGE_500 (first 500 elements): rps=15259.8 (overall: 16836.9) avg_msec=18.713 (overall: 15.509) LRANGE_500 (first 500 elements): rps=19575.4 (overall: 17003.9) avg_msec=12.271 (overall: 15.281) LRANGE_500 (first 500 elements): rps=18980.2 (overall: 17117.9) avg_msec=12.823 (overall: 15.124) LRANGE_500 (first 500 elements): rps=17072.0 (overall: 17115.4) avg_msec=16.901 (overall: 15.220) LRANGE_500 (first 500 elements): rps=17727.3 (overall: 17147.1) avg_msec=14.955 (overall: 15.205) LRANGE_500 (first 500 elements): rps=19916.3 (overall: 17282.4) avg_msec=10.770 (overall: 14.956) LRANGE_500 (first 500 elements): rps=19944.7 (overall: 17407.3) avg_msec=10.217 (overall: 14.701) LRANGE_500 (first 500 elements): rps=19884.5 (overall: 17517.5) avg_msec=10.837 (overall: 14.506) ====== LRANGE_500 (first 500 elements) ======
292s 100000 requests completed in 5.70 seconds
292s 50 parallel clients
292s 3 bytes payload
292s keep alive: 1
292s host configuration "save": 3600 1 300 100 60 10000
292s host configuration "appendonly": no
292s multi-thread: no
292s
292s Latency by percentile distribution:
292s 0.000% <= 0.551 milliseconds (cumulative count 10)
292s 50.000% <= 12.391 milliseconds (cumulative count 50040)
292s 75.000% <= 18.991 milliseconds (cumulative count 75060)
292s 87.500% <= 22.927 milliseconds (cumulative count 87560)
292s 93.750% <= 25.359 milliseconds (cumulative count 93750)
292s 96.875% <= 27.103 milliseconds (cumulative count 96880)
292s 98.438% <= 28.415 milliseconds (cumulative count 98440)
292s 99.219% <= 29.455 milliseconds (cumulative count 99220)
292s 99.609% <= 30.895 milliseconds (cumulative count 99610)
292s 99.805% <= 32.399 milliseconds (cumulative count 99810)
292s 99.902% <= 32.831 milliseconds (cumulative count 99910)
292s 99.951% <= 33.343 milliseconds (cumulative count 99960)
292s 99.976% <= 33.471
milliseconds (cumulative count 99980) 292s 99.988% <= 34.559 milliseconds (cumulative count 99990) 292s 99.994% <= 34.751 milliseconds (cumulative count 100000) 292s 100.000% <= 34.751 milliseconds (cumulative count 100000) 292s 292s Cumulative distribution of latencies: 292s 0.000% <= 0.103 milliseconds (cumulative count 0) 292s 0.010% <= 0.607 milliseconds (cumulative count 10) 292s 0.030% <= 1.303 milliseconds (cumulative count 30) 292s 0.050% <= 1.503 milliseconds (cumulative count 50) 292s 0.070% <= 1.607 milliseconds (cumulative count 70) 292s 0.080% <= 1.807 milliseconds (cumulative count 80) 292s 0.120% <= 1.903 milliseconds (cumulative count 120) 292s 0.130% <= 2.007 milliseconds (cumulative count 130) 292s 0.160% <= 2.103 milliseconds (cumulative count 160) 292s 0.600% <= 3.103 milliseconds (cumulative count 600) 292s 1.550% <= 4.103 milliseconds (cumulative count 1550) 292s 2.560% <= 5.103 milliseconds (cumulative count 2560) 292s 3.870% <= 6.103 milliseconds (cumulative count 3870) 292s 7.930% <= 7.103 milliseconds (cumulative count 7930) 292s 12.810% <= 8.103 milliseconds (cumulative count 12810) 292s 17.550% <= 9.103 milliseconds (cumulative count 17550) 292s 24.860% <= 10.103 milliseconds (cumulative count 24860) 292s 36.350% <= 11.103 milliseconds (cumulative count 36350) 292s 47.810% <= 12.103 milliseconds (cumulative count 47810) 292s 54.040% <= 13.103 milliseconds (cumulative count 54040) 292s 57.680% <= 14.103 milliseconds (cumulative count 57680) 292s 60.820% <= 15.103 milliseconds (cumulative count 60820) 292s 64.350% <= 16.103 milliseconds (cumulative count 64350) 292s 67.730% <= 17.103 milliseconds (cumulative count 67730) 292s 71.720% <= 18.111 milliseconds (cumulative count 71720) 292s 75.400% <= 19.103 milliseconds (cumulative count 75400) 292s 78.790% <= 20.111 milliseconds (cumulative count 78790) 292s 82.220% <= 21.103 milliseconds (cumulative count 82220) 292s 85.250% <= 22.111 milliseconds (cumulative count 85250) 292s 88.070% <= 
23.103 milliseconds (cumulative count 88070) 292s 90.590% <= 24.111 milliseconds (cumulative count 90590) 292s 93.110% <= 25.103 milliseconds (cumulative count 93110) 292s 95.480% <= 26.111 milliseconds (cumulative count 95480) 292s 96.880% <= 27.103 milliseconds (cumulative count 96880) 292s 98.160% <= 28.111 milliseconds (cumulative count 98160) 292s 98.930% <= 29.103 milliseconds (cumulative count 98930) 292s 99.460% <= 30.111 milliseconds (cumulative count 99460) 292s 99.650% <= 31.103 milliseconds (cumulative count 99650) 292s 99.750% <= 32.111 milliseconds (cumulative count 99750) 292s 99.940% <= 33.119 milliseconds (cumulative count 99940) 292s 99.980% <= 34.111 milliseconds (cumulative count 99980) 292s 100.000% <= 35.103 milliseconds (cumulative count 100000) 292s 292s Summary: 292s throughput summary: 17540.78 requests per second 292s latency summary (msec): 292s avg min p50 p95 p99 max 292s 14.467 0.544 12.391 25.903 29.215 34.751 299s LRANGE_600 (first 600 elements): rps=7893.7 (overall: 10335.1) avg_msec=25.227 (overall: 25.227) LRANGE_600 (first 600 elements): rps=12936.8 (overall: 11807.6) avg_msec=20.855 (overall: 22.516) LRANGE_600 (first 600 elements): rps=13851.0 (overall: 12549.9) avg_msec=17.143 (overall: 20.362) LRANGE_600 (first 600 elements): rps=13031.4 (overall: 12678.2) avg_msec=21.523 (overall: 20.680) LRANGE_600 (first 600 elements): rps=15311.0 (overall: 13230.4) avg_msec=17.470 (overall: 19.901) LRANGE_600 (first 600 elements): rps=16549.8 (overall: 13800.3) avg_msec=12.258 (overall: 18.327) LRANGE_600 (first 600 elements): rps=15157.5 (overall: 14001.2) avg_msec=14.714 (overall: 17.748) LRANGE_600 (first 600 elements): rps=14736.0 (overall: 14094.6) avg_msec=17.777 (overall: 17.752) LRANGE_600 (first 600 elements): rps=14464.3 (overall: 14136.6) avg_msec=18.506 (overall: 17.840) LRANGE_600 (first 600 elements): rps=10736.4 (overall: 13782.3) avg_msec=25.306 (overall: 18.446) LRANGE_600 (first 600 elements): rps=12138.3 (overall: 
13629.9) avg_msec=22.874 (overall: 18.812) LRANGE_600 (first 600 elements): rps=14142.3 (overall: 13673.4) avg_msec=18.475 (overall: 18.782) LRANGE_600 (first 600 elements): rps=13035.4 (overall: 13623.3) avg_msec=20.272 (overall: 18.894) LRANGE_600 (first 600 elements): rps=15713.1 (overall: 13773.7) avg_msec=17.547 (overall: 18.783) LRANGE_600 (first 600 elements): rps=14346.6 (overall: 13812.2) avg_msec=18.060 (overall: 18.733) LRANGE_600 (first 600 elements): rps=14628.5 (overall: 13863.9) avg_msec=18.654 (overall: 18.728) LRANGE_600 (first 600 elements): rps=11595.3 (overall: 13726.7) avg_msec=23.469 (overall: 18.970) LRANGE_600 (first 600 elements): rps=11945.1 (overall: 13625.8) avg_msec=22.389 (overall: 19.140) LRANGE_600 (first 600 elements): rps=10545.8 (overall: 13463.2) avg_msec=25.641 (overall: 19.408) LRANGE_600 (first 600 elements): rps=13932.5 (overall: 13486.8) avg_msec=19.653 (overall: 19.421) LRANGE_600 (first 600 elements): rps=15291.3 (overall: 13574.0) avg_msec=16.731 (overall: 19.275) LRANGE_600 (first 600 elements): rps=13914.1 (overall: 13589.7) avg_msec=18.674 (overall: 19.246) LRANGE_600 (first 600 elements): rps=14737.1 (overall: 13639.7) avg_msec=18.374 (overall: 19.205) LRANGE_600 (first 600 elements): rps=13384.9 (overall: 13629.0) avg_msec=19.612 (overall: 19.222) LRANGE_600 (first 600 elements): rps=14745.0 (overall: 13673.7) avg_msec=18.298 (overall: 19.182) LRANGE_600 (first 600 elements): rps=12841.3 (overall: 13641.5) avg_msec=20.760 (overall: 19.240) LRANGE_600 (first 600 elements): rps=13581.7 (overall: 13639.3) avg_msec=19.986 (overall: 19.267) LRANGE_600 (first 600 elements): rps=11286.9 (overall: 13555.2) avg_msec=23.852 (overall: 19.403) LRANGE_600 (first 600 elements): rps=13952.0 (overall: 13568.9) avg_msec=19.070 (overall: 19.392) ====== LRANGE_600 (first 600 elements) ====== 299s 100000 requests completed in 7.36 seconds 299s 50 parallel clients 299s 3 bytes payload 299s keep alive: 1 299s host configuration "save": 
3600 1 300 100 60 10000 299s host configuration "appendonly": no 299s multi-thread: no 299s 299s Latency by percentile distribution: 299s 0.000% <= 1.295 milliseconds (cumulative count 10) 299s 50.000% <= 19.727 milliseconds (cumulative count 50010) 299s 75.000% <= 25.871 milliseconds (cumulative count 75040) 299s 87.500% <= 29.135 milliseconds (cumulative count 87510) 299s 93.750% <= 31.151 milliseconds (cumulative count 93810) 299s 96.875% <= 32.575 milliseconds (cumulative count 96930) 299s 98.438% <= 34.175 milliseconds (cumulative count 98450) 299s 99.219% <= 35.583 milliseconds (cumulative count 99230) 299s 99.609% <= 36.767 milliseconds (cumulative count 99610) 299s 99.805% <= 37.375 milliseconds (cumulative count 99810) 299s 99.902% <= 38.079 milliseconds (cumulative count 99920) 299s 99.951% <= 38.655 milliseconds (cumulative count 99960) 299s 99.976% <= 38.975 milliseconds (cumulative count 99980) 299s 99.988% <= 39.167 milliseconds (cumulative count 99990) 299s 99.994% <= 39.359 milliseconds (cumulative count 100000) 299s 100.000% <= 39.359 milliseconds (cumulative count 100000) 299s 299s Cumulative distribution of latencies: 299s 0.000% <= 0.103 milliseconds (cumulative count 0) 299s 0.020% <= 1.303 milliseconds (cumulative count 20) 299s 0.080% <= 1.407 milliseconds (cumulative count 80) 299s 0.100% <= 1.503 milliseconds (cumulative count 100) 299s 0.170% <= 1.607 milliseconds (cumulative count 170) 299s 0.250% <= 1.703 milliseconds (cumulative count 250) 299s 0.330% <= 1.807 milliseconds (cumulative count 330) 299s 0.400% <= 1.903 milliseconds (cumulative count 400) 299s 0.520% <= 2.007 milliseconds (cumulative count 520) 299s 0.670% <= 2.103 milliseconds (cumulative count 670) 299s 2.020% <= 3.103 milliseconds (cumulative count 2020) 299s 2.860% <= 4.103 milliseconds (cumulative count 2860) 299s 3.810% <= 5.103 milliseconds (cumulative count 3810) 299s 5.120% <= 6.103 milliseconds (cumulative count 5120) 299s 6.680% <= 7.103 milliseconds (cumulative 
count 6680) 299s 8.620% <= 8.103 milliseconds (cumulative count 8620) 299s 11.050% <= 9.103 milliseconds (cumulative count 11050) 299s 14.330% <= 10.103 milliseconds (cumulative count 14330) 299s 17.790% <= 11.103 milliseconds (cumulative count 17790) 299s 21.150% <= 12.103 milliseconds (cumulative count 21150) 299s 25.410% <= 13.103 milliseconds (cumulative count 25410) 299s 29.430% <= 14.103 milliseconds (cumulative count 29430) 299s 32.910% <= 15.103 milliseconds (cumulative count 32910) 299s 37.020% <= 16.103 milliseconds (cumulative count 37020) 299s 41.070% <= 17.103 milliseconds (cumulative count 41070) 299s 44.820% <= 18.111 milliseconds (cumulative count 44820) 299s 47.870% <= 19.103 milliseconds (cumulative count 47870) 299s 51.400% <= 20.111 milliseconds (cumulative count 51400) 299s 54.870% <= 21.103 milliseconds (cumulative count 54870) 299s 58.650% <= 22.111 milliseconds (cumulative count 58650) 299s 63.040% <= 23.103 milliseconds (cumulative count 63040) 299s 67.940% <= 24.111 milliseconds (cumulative count 67940) 299s 71.860% <= 25.103 milliseconds (cumulative count 71860) 299s 75.950% <= 26.111 milliseconds (cumulative count 75950) 299s 80.110% <= 27.103 milliseconds (cumulative count 80110) 299s 83.870% <= 28.111 milliseconds (cumulative count 83870) 299s 87.410% <= 29.103 milliseconds (cumulative count 87410) 299s 90.390% <= 30.111 milliseconds (cumulative count 90390) 299s 93.640% <= 31.103 milliseconds (cumulative count 93640) 299s 96.110% <= 32.111 milliseconds (cumulative count 96110) 299s 97.530% <= 33.119 milliseconds (cumulative count 97530) 299s 98.410% <= 34.111 milliseconds (cumulative count 98410) 299s 99.020% <= 35.103 milliseconds (cumulative count 99020) 299s 99.420% <= 36.127 milliseconds (cumulative count 99420) 299s 99.710% <= 37.119 milliseconds (cumulative count 99710) 299s 99.920% <= 38.111 milliseconds (cumulative count 99920) 299s 99.980% <= 39.103 milliseconds (cumulative count 99980) 299s 100.000% <= 40.127 milliseconds 
(cumulative count 100000) 299s 299s Summary: 299s throughput summary: 13579.58 requests per second 299s latency summary (msec): 299s avg min p50 p95 p99 max 299s 19.387 1.288 19.727 31.599 35.103 39.359 299s MSET (10 keys): rps=155776.9 (overall: 247468.3) avg_msec=1.867 (overall: 1.867) ====== MSET (10 keys) ====== 299s 100000 requests completed in 0.40 seconds 299s 50 parallel clients 299s 3 bytes payload 299s keep alive: 1 299s host configuration "save": 3600 1 300 100 60 10000 299s host configuration "appendonly": no 299s multi-thread: no 299s 299s Latency by percentile distribution: 299s 0.000% <= 0.455 milliseconds (cumulative count 10) 299s 50.000% <= 1.895 milliseconds (cumulative count 50940) 299s 75.000% <= 2.063 milliseconds (cumulative count 75200) 299s 87.500% <= 2.167 milliseconds (cumulative count 88100) 299s 93.750% <= 2.239 milliseconds (cumulative count 94240) 299s 96.875% <= 2.295 milliseconds (cumulative count 96990) 299s 98.438% <= 2.359 milliseconds (cumulative count 98500) 299s 99.219% <= 2.423 milliseconds (cumulative count 99220) 299s 99.609% <= 2.487 milliseconds (cumulative count 99620) 299s 99.805% <= 2.543 milliseconds (cumulative count 99840) 299s 99.902% <= 2.583 milliseconds (cumulative count 99920) 299s 99.951% <= 2.615 milliseconds (cumulative count 99960) 299s 99.976% <= 2.639 milliseconds (cumulative count 99990) 299s 99.994% <= 2.727 milliseconds (cumulative count 100000) 299s 100.000% <= 2.727 milliseconds (cumulative count 100000) 299s 299s Cumulative distribution of latencies: 299s 0.000% <= 0.103 milliseconds (cumulative count 0) 299s 0.040% <= 0.503 milliseconds (cumulative count 40) 299s 0.080% <= 0.607 milliseconds (cumulative count 80) 299s 0.100% <= 0.703 milliseconds (cumulative count 100) 299s 0.120% <= 1.007 milliseconds (cumulative count 120) 299s 0.240% <= 1.103 milliseconds (cumulative count 240) 299s 1.660% <= 1.207 milliseconds (cumulative count 1660) 299s 5.400% <= 1.303 milliseconds (cumulative count 5400) 
299s 7.330% <= 1.407 milliseconds (cumulative count 7330) 299s 8.060% <= 1.503 milliseconds (cumulative count 8060) 299s 11.920% <= 1.607 milliseconds (cumulative count 11920) 299s 23.250% <= 1.703 milliseconds (cumulative count 23250) 299s 37.700% <= 1.807 milliseconds (cumulative count 37700) 299s 52.080% <= 1.903 milliseconds (cumulative count 52080) 299s 67.350% <= 2.007 milliseconds (cumulative count 67350) 299s 80.580% <= 2.103 milliseconds (cumulative count 80580) 299s 100.000% <= 3.103 milliseconds (cumulative count 100000) 299s 299s Summary: 299s throughput summary: 248138.95 requests per second 299s latency summary (msec): 299s avg min p50 p95 p99 max 299s 1.872 0.448 1.895 2.255 2.407 2.727 299s 299s autopkgtest [16:15:43]: test 0002-benchmark: -----------------------] 300s autopkgtest [16:15:44]: test 0002-benchmark: - - - - - - - - - - results - - - - - - - - - - 300s 0002-benchmark PASS 300s autopkgtest [16:15:44]: test 0003-redis-check-aof: preparing testbed 301s Reading package lists... 301s Building dependency tree... 301s Reading state information... 301s Starting pkgProblemResolver with broken count: 0 301s Starting 2 pkgProblemResolver with broken count: 0 301s Done 302s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 304s autopkgtest [16:15:48]: test 0003-redis-check-aof: [----------------------- 304s autopkgtest [16:15:48]: test 0003-redis-check-aof: -----------------------] 305s autopkgtest [16:15:49]: test 0003-redis-check-aof: - - - - - - - - - - results - - - - - - - - - - 305s 0003-redis-check-aof PASS 305s autopkgtest [16:15:49]: test 0004-redis-check-rdb: preparing testbed 305s Reading package lists... 306s Building dependency tree... 306s Reading state information... 306s Starting pkgProblemResolver with broken count: 0 306s Starting 2 pkgProblemResolver with broken count: 0 306s Done 307s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
308s autopkgtest [16:15:52]: test 0004-redis-check-rdb: [----------------------- 314s OK 314s [offset 0] Checking RDB file /var/lib/redis/dump.rdb 314s [offset 27] AUX FIELD redis-ver = '7.0.15' 314s [offset 41] AUX FIELD redis-bits = '64' 314s [offset 53] AUX FIELD ctime = '1742055358' 314s [offset 68] AUX FIELD used-mem = '1622800' 314s [offset 80] AUX FIELD aof-base = '0' 314s [offset 82] Selecting DB ID 0 314s [offset 7184] Checksum OK 314s [offset 7184] \o/ RDB looks OK! \o/ 314s [info] 4 keys read 314s [info] 0 expires 314s [info] 0 already expired 314s autopkgtest [16:15:58]: test 0004-redis-check-rdb: -----------------------] 315s autopkgtest [16:15:59]: test 0004-redis-check-rdb: - - - - - - - - - - results - - - - - - - - - - 315s 0004-redis-check-rdb PASS 315s autopkgtest [16:15:59]: test 0005-cjson: preparing testbed 316s Reading package lists... 316s Building dependency tree... 316s Reading state information... 316s Starting pkgProblemResolver with broken count: 0 316s Starting 2 pkgProblemResolver with broken count: 0 316s Done 317s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 319s autopkgtest [16:16:03]: test 0005-cjson: [----------------------- 324s 325s autopkgtest [16:16:09]: test 0005-cjson: -----------------------] 325s autopkgtest [16:16:09]: test 0005-cjson: - - - - - - - - - - results - - - - - - - - - - 325s 0005-cjson PASS 326s autopkgtest [16:16:10]: @@@@@@@@@@@@@@@@@@@@ summary 326s 0001-redis-cli PASS 326s 0002-benchmark PASS 326s 0003-redis-check-aof PASS 326s 0004-redis-check-rdb PASS 326s 0005-cjson PASS 332s nova [W] Using flock in prodstack6-arm64 332s Creating nova instance adt-plucky-arm64-redis-20250315-161044-juju-7f2275-prod-proposed-migration-environment-2-f907425e-6e99-464e-8d26-aa1612b7b34e from image adt/ubuntu-plucky-arm64-server-20250315.img (UUID bd6e766c-b51f-4b53-86d6-23aa4d18f524)... 332s nova [W] Timed out waiting for 593430fc-a50c-4ba0-8d82-3dc8785ee4bc to get deleted.
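When comparing benchmark runs across triggers (here glibc/2.41-1ubuntu2), it can help to pull the headline figures out of the redis-benchmark "Summary:" blocks programmatically. The sketch below is not part of autopkgtest or the redis test suite; it is a minimal helper (function name my own) that parses one Summary block after the "NNNs" elapsed-time prefixes have been stripped:

```python
import re

def parse_benchmark_summary(log: str) -> dict:
    """Extract throughput and latency figures from one redis-benchmark
    'Summary:' block, in the format seen in the log above (with the
    autopkgtest 'NNNs' elapsed-time prefixes already removed)."""
    rps = float(
        re.search(r"throughput summary:\s+([\d.]+) requests per second", log).group(1)
    )
    # The latency table is a header row (avg min p50 p95 p99 max)
    # followed by one row of values in the same column order.
    m = re.search(
        r"latency summary \(msec\):\s*\n"
        r"\s*avg\s+min\s+p50\s+p95\s+p99\s+max\s*\n"
        r"\s*([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)",
        log,
    )
    latency = dict(
        zip(("avg", "min", "p50", "p95", "p99", "max"),
            (float(v) for v in m.groups()))
    )
    return {"rps": rps, "latency_msec": latency}

# The LRANGE_600 summary from the log above, prefixes stripped:
sample = """\
Summary:
  throughput summary: 13579.58 requests per second
  latency summary (msec):
          avg       min       p50       p95       p99       max
       19.387     1.288    19.727    31.599    35.103    39.359
"""
result = parse_benchmark_summary(sample)
print(result["rps"], result["latency_msec"]["p99"])
```

Feeding each test's Summary block through this gives a small dict per command (e.g. LRANGE_500, LRANGE_600, MSET) that is easy to diff between the proposed-migration run and a baseline.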