Started by remote host 172.23.110.241
[EnvInject] - Loading node environment variables.
Building remotely on slv-sc2302-32g-12c (P0 jython_slave) in workspace /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1
[WS-CLEANUP] Deleting project workspace...
Running Prebuild steps
[centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1] $ /bin/sh -xe /tmp/jenkins695516544814780028.sh
++ echo collections-failover_and_recovery_dgm_7.0_P1-Sep-14-02:16:14-7.1.0-1277
++ awk '{split($0,r,"-");print r[1],r[2]}'
+ desc='collections failover_and_recovery_dgm_7.0_P1'
+ echo Desc: 7.1.0-1277 - collections failover_and_recovery_dgm_7.0_P1 - centos
Desc: 7.1.0-1277 - collections failover_and_recovery_dgm_7.0_P1 - centos
+ echo newState=available
+ newState=available
Success build for hudson.tasks.Shell@2485b2b5
[description-setter] Description set: 7.1.0-1277 - collections failover_and_recovery_dgm_7.0_P1 - centos
Success build for hudson.plugins.descriptionsetter.DescriptionSetterBuilder@5080f310
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'propfile'
[EnvInject] - Variables injected successfully.
Success build for org.jenkinsci.plugins.envinject.EnvInjectBuilder@678206a5
Cloning the remote Git repository
Using shallow clone
Cloning repository git://github.com/couchbaselabs/TAF.git
 > /usr/bin/git init /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1 # timeout=10
Fetching upstream changes from git://github.com/couchbaselabs/TAF.git
 > /usr/bin/git --version # timeout=10
 > /usr/bin/git fetch --tags --progress git://github.com/couchbaselabs/TAF.git +refs/heads/*:refs/remotes/origin/* --depth=1 # timeout=30
 > /usr/bin/git config remote.origin.url git://github.com/couchbaselabs/TAF.git # timeout=10
 > /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > /usr/bin/git config remote.origin.url git://github.com/couchbaselabs/TAF.git # timeout=10
Fetching upstream changes from git://github.com/couchbaselabs/TAF.git
 > /usr/bin/git fetch --tags --progress git://github.com/couchbaselabs/TAF.git +refs/heads/*:refs/remotes/origin/* --depth=1 # timeout=30
 > /usr/bin/git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 04ebae75e956698e23f219fd18641f648516c001 (origin/master)
 > /usr/bin/git config core.sparsecheckout # timeout=10
 > /usr/bin/git checkout -f 04ebae75e956698e23f219fd18641f648516c001
 > /usr/bin/git rev-list 04ebae75e956698e23f219fd18641f648516c001 # timeout=10
 > /usr/bin/git tag -a -f -m Jenkins Build #142505 jenkins-test_suite_executor-TAF-142505 # timeout=10
No emails were triggered.
[centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1] $ /bin/sh -xe /tmp/jenkins3369600515103346763.sh
+ echo Desc: collections-failover_and_recovery_dgm_7.0_P1-Sep-14-02:16:14-7.1.0-1277
Desc: collections-failover_and_recovery_dgm_7.0_P1-Sep-14-02:16:14-7.1.0-1277
[description-setter] Could not determine description.
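The prebuild step above derives the job description by splitting the suite descriptor on "-" and keeping the first two fields (component and subcomponent), as the awk '{split($0,r,"-");print r[1],r[2]}' trace shows. A minimal Python sketch of the same parse (variable names here are illustrative, not taken from the job scripts):

    # Illustrative only: mirror the awk-based descriptor parse from the prebuild step.
    descriptor = "collections-failover_and_recovery_dgm_7.0_P1-Sep-14-02:16:14-7.1.0-1277"
    component, subcomponent = descriptor.split("-")[:2]
    desc = component + " " + subcomponent
    print("Desc: " + desc)   # Desc: collections failover_and_recovery_dgm_7.0_P1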
[centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1] $ /bin/bash /tmp/jenkins5430948040324185482.sh
Jython 2.7.0
##############################################
Installing dependencies from requirements.txt
##############################################
Requirement already satisfied (use --upgrade to upgrade): futures==3.3.0 in /opt/jython/Lib/site-packages (from -r requirements.txt (line 1))
Requirement already satisfied (use --upgrade to upgrade): requests==2.24.0 in /opt/jython/Lib/site-packages (from -r requirements.txt (line 2))
Requirement already satisfied (use --upgrade to upgrade): urllib3==1.25.10 in /opt/jython/Lib/site-packages (from -r requirements.txt (line 3))
Requirement already satisfied (use --upgrade to upgrade): ruamel.yaml==0.16.12 in /opt/jython/Lib/site-packages (from -r requirements.txt (line 4))
Requirement already satisfied (use --upgrade to upgrade): six==1.15.0 in /opt/jython/Lib/site-packages (from -r requirements.txt (line 5))
Cleaning up...
#################### Done ####################
[centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1] $ /bin/bash /tmp/jenkins7495544370797795606.sh
Cloning into 'guides'...
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'rerun_props_file'
[EnvInject] - Variables injected successfully.
[centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1] $ /bin/bash /tmp/jenkins2544471713227371009.sh
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 119804
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 202400
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 119804
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
7.1.0-1277
Set ALLOW_HTP to False so test could run.
the major release is 7
"172.23.106.45","172.23.123.119","172.23.121.221","172.23.106.8","172.23.121.222"
Searching for httplib2
Best match: httplib2 0.9.2
Adding httplib2 0.9.2 to easy-install.pth file
Using /usr/lib/python2.7/site-packages
Processing dependencies for httplib2
Finished processing dependencies for httplib2
Requirement already satisfied (use --upgrade to upgrade): requests in /opt/jython/Lib/site-packages
Cleaning up...
Requirement already satisfied (use --upgrade to upgrade): futures in /opt/jython/Lib/site-packages
Cleaning up...
Cloning into 'testrunner'...
[global]
username:root
password:couchbase
data_path:/data

[membase]
rest_username:Administrator
rest_password:password

[servers]
1:_1
2:_2
3:_3
4:_4
5:_5

[_1]
ip:dynamic
port:8091
n1ql_port:18093
index_port:9102

[_2]
ip:dynamic
port:8091

[_3]
ip:dynamic
port:8091

[_4]
ip:dynamic
port:8091

[_5]
ip:dynamic
port:8091

python3 scripts/populateIni.py -s "172.23.106.45","172.23.123.119","172.23.121.221","172.23.106.8","172.23.121.222" -d None -a None -i /tmp/testexec_reformat.38389.ini -p centos -o /tmp/testexec.38389.ini -k {}
INFO:root:SSH Connecting to 172.23.106.45 with username:root, attempt#1 of 5
INFO:root:SSH Connecting to 172.23.123.119 with username:root, attempt#1 of 5
INFO:root:SSH Connecting to 172.23.121.221 with username:root, attempt#1 of 5
INFO:root:SSH Connecting to 172.23.106.8 with username:root, attempt#1 of 5
INFO:root:SSH Connecting to 172.23.121.222 with username:root, attempt#1 of 5
INFO:root:SSH Connected to 172.23.121.221 as root
INFO:root:SSH Connected to 172.23.106.8 as root
INFO:root:SSH Connected to 172.23.121.222 as root
INFO:root:SSH Connected to 172.23.123.119 as root
INFO:root:SSH Connected to 172.23.106.45 as root
INFO:root:os_distro: CentOS, os_version: centos 7, is_linux_distro: True
INFO:root:os_distro: CentOS, os_version: centos 7, is_linux_distro: True
INFO:root:os_distro: CentOS, os_version: centos 7, is_linux_distro: True
INFO:root:os_distro: CentOS, os_version: centos 7, is_linux_distro: True
INFO:root:os_distro: CentOS, os_version: centos 7, is_linux_distro: True
INFO:root:extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7
INFO:root:running command.raw on 172.23.121.222: sh -c 'if [[ "$OSTYPE" == "darwin"* ]]; then sysctl hw.memsize|grep -Eo [0-9]; else grep MemTotal /proc/meminfo|grep -Eo [0-9]; fi'
INFO:root:extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7
INFO:root:running command.raw on 172.23.106.8: sh -c 'if [[ "$OSTYPE" == "darwin"* ]]; then sysctl hw.memsize|grep -Eo [0-9]; else grep MemTotal /proc/meminfo|grep -Eo [0-9]; fi'
INFO:root:extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7
INFO:root:running command.raw on 172.23.123.119: sh -c 'if [[ "$OSTYPE" == "darwin"* ]]; then sysctl hw.memsize|grep -Eo [0-9]; else grep MemTotal /proc/meminfo|grep -Eo [0-9]; fi'
INFO:root:extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7
INFO:root:running command.raw on 172.23.121.221: sh -c 'if [[ "$OSTYPE" == "darwin"* ]]; then sysctl hw.memsize|grep -Eo [0-9]; else grep MemTotal /proc/meminfo|grep -Eo [0-9]; fi'
INFO:root:command executed successfully with root
INFO:root:command executed successfully with root
INFO:root:command executed successfully with root
INFO:root:command executed successfully with root
INFO:root:extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7
INFO:root:running command.raw on 172.23.106.45: sh -c 'if [[ "$OSTYPE" == "darwin"* ]]; then sysctl hw.memsize|grep -Eo [0-9]; else grep MemTotal /proc/meminfo|grep -Eo [0-9]; fi'
INFO:root:command executed successfully with root
in main the ini file is /tmp/testexec_reformat.38389.ini
the given server info is "172.23.106.45","172.23.123.119","172.23.121.221","172.23.106.8","172.23.121.222"
Collecting memory info from 172.23.121.222
Collecting memory info from 172.23.106.8
Collecting memory info from 172.23.123.119
Collecting memory info from 172.23.121.221
Collecting memory info from 172.23.106.45
the servers memory info is [('172.23.123.119',
4103196), ('172.23.121.222', 4103212), ('172.23.121.221', 4103212), ('172.23.106.8', 4103360), ('172.23.106.45', 4106104)] [global] username:root password:couchbase data_path:/data [membase] rest_username:Administrator rest_password:password [servers] 1:_1 2:_2 3:_3 4:_4 5:_5 [_1] ip:172.23.123.119 port:8091 n1ql_port:18093 index_port:9102 [_2] ip:172.23.121.222 port:8091 [_3] ip:172.23.121.221 port:8091 [_4] ip:172.23.106.8 port:8091 [_5] ip:172.23.106.45 port:8091 extra install is Starting a Gradle Daemon, 17 busy Daemons could not be reused, use --status for details > Configure project : Executing 'gradle clean' Using Transaction_client :: 1.1.0 Running: /opt/jython/bin/jython -J-cp /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.0/a295271b66684b05dd5345d7b7a6232e03054ef9/couchbase-transactions-1.1.0.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.0.7/892363ab817451b9bb8fa91861c0d66c480056eb/java-client-3.0.7.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.0.8/d21bea503f8be1ce9d68143e4aaf2f203a48d2b7/core-io-2.0.8.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.3.8.RELEASE/e6b2e8dc1f6548ae216b2ecdee0c0a32176317a/reactor-core-3.3.8.RELEASE.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/ssh.py -i /tmp/testexec_root.38389.ini iptables -F Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.0/a295271b66684b05dd5345d7b7a6232e03054ef9/couchbase-transactions-1.1.0.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.0.7/892363ab817451b9bb8fa91861c0d66c480056eb/java-client-3.0.7.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.0.8/d21bea503f8be1ce9d68143e4aaf2f203a48d2b7/core-io-2.0.8.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.3.8.RELEASE/e6b2e8dc1f6548ae216b2ecdee0c0a32176317a/reactor-core-3.3.8.RELEASE.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/eagles_all_around.py -i /tmp/testexec_root.38389.ini iptables -F Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.0/a295271b66684b05dd5345d7b7a6232e03054ef9/couchbase-transactions-1.1.0.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.0.7/892363ab817451b9bb8fa91861c0d66c480056eb/java-client-3.0.7.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.0.8/d21bea503f8be1ce9d68143e4aaf2f203a48d2b7/core-io-2.0.8.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.3.8.RELEASE/e6b2e8dc1f6548ae216b2ecdee0c0a32176317a/reactor-core-3.3.8.RELEASE.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main scripts/install.py -i /tmp/testexec_root.38389.ini iptables -F Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.0/a295271b66684b05dd5345d7b7a6232e03054ef9/couchbase-transactions-1.1.0.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.0.7/892363ab817451b9bb8fa91861c0d66c480056eb/java-client-3.0.7.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.0.8/d21bea503f8be1ce9d68143e4aaf2f203a48d2b7/core-io-2.0.8.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.3.8.RELEASE/e6b2e8dc1f6548ae216b2ecdee0c0a32176317a/reactor-core-3.3.8.RELEASE.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources testrunner.py -i /tmp/testexec_root.38389.ini iptables -F Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.0/a295271b66684b05dd5345d7b7a6232e03054ef9/couchbase-transactions-1.1.0.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.0.7/892363ab817451b9bb8fa91861c0d66c480056eb/java-client-3.0.7.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.0.8/d21bea503f8be1ce9d68143e4aaf2f203a48d2b7/core-io-2.0.8.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.3.8.RELEASE/e6b2e8dc1f6548ae216b2ecdee0c0a32176317a/reactor-core-3.3.8.RELEASE.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources scripts/rerun_jobs.py -i /tmp/testexec_root.38389.ini iptables -F > Task :iptables ERROR StatusLogger No Log4j 2 configuration file found. Using default configuration (logging only errors to the console), or user programmatically provided configurations. Set system property 'log4j2.debug' to show Log4j 2 internal initialization logging. See https://logging.apache.org/log4j/2.x/manual/configuration.html for instructions on how to configure Log4j 2 172.23.106.8 172.23.123.119 172.23.121.221 172.23.121.222 172.23.106.45 Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/6.9/userguide/command_line_interface.html#sec:command_line_warnings BUILD SUCCESSFUL in 15s 1 actionable task: 1 executed Already on 'master' From https://github.com/couchbase/testrunner * branch master -> FETCH_HEAD Already up-to-date. 
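Before the actual install, the :iptables Gradle task above invokes scripts/ssh.py with the root ini (iptables -F), flushing firewall rules on every node in the cluster so the REST and data ports are reachable. The following is only a standalone sketch of that idea: it uses paramiko instead of the framework's own remote-shell wrapper, and the host list and credentials are the ones shown in the ini above.

    # Sketch only: flush iptables on each cluster node over SSH.
    # TAF/testrunner use their own SSH wrapper; paramiko is assumed here.
    import paramiko

    NODES = ["172.23.123.119", "172.23.121.222", "172.23.121.221",
             "172.23.106.8", "172.23.106.45"]
    USER, PASSWORD = "root", "couchbase"   # from the [global] section of the ini

    def flush_iptables(host):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=USER, password=PASSWORD, timeout=30)
        try:
            _, stdout, stderr = client.exec_command("iptables -F")
            if stdout.channel.recv_exit_status() != 0:
                print("iptables -F failed on %s: %s" % (host, stderr.read()))
        finally:
            client.close()

    for node in NODES:
        flush_iptables(node)
        print(node)   # the task prints each node it touched, as in the log above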
7.1.0-1277 2021-09-14 02:15:28,646 - root - WARNING - URL: is not valid, will use version to locate build 2021-09-14 02:15:28,647 - root - INFO - SSH Connecting to 172.23.123.119 with username:root, attempt#1 of 5 2021-09-14 02:15:28,757 - root - INFO - SSH Connected to 172.23.123.119 as root 2021-09-14 02:15:29,062 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:29,377 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:29,378 - root - INFO - SSH Connecting to 172.23.121.222 with username:root, attempt#1 of 5 2021-09-14 02:15:29,484 - root - INFO - SSH Connected to 172.23.121.222 as root 2021-09-14 02:15:29,757 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:30,055 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:30,056 - root - INFO - SSH Connecting to 172.23.121.221 with username:root, attempt#1 of 5 2021-09-14 02:15:30,164 - root - INFO - SSH Connected to 172.23.121.221 as root 2021-09-14 02:15:30,436 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:30,746 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:30,748 - root - INFO - SSH Connecting to 172.23.106.8 with username:root, attempt#1 of 5 2021-09-14 02:15:30,851 - root - INFO - SSH Connected to 172.23.106.8 as root 2021-09-14 02:15:31,144 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:31,445 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:31,447 - root - INFO - SSH Connecting to 172.23.106.45 with username:root, attempt#1 of 5 2021-09-14 02:15:31,647 - root - INFO - SSH Connected to 172.23.106.45 as root 2021-09-14 02:15:32,093 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:32,530 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:32,532 - root - INFO - SSH Connecting to 172.23.123.119 with username:root, attempt#1 of 5 2021-09-14 02:15:32,634 - root - INFO - SSH Connected to 172.23.123.119 as root 2021-09-14 02:15:32,915 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:33,223 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:33,224 - root - INFO - SSH Connecting to 172.23.121.222 with username:root, attempt#1 of 5 2021-09-14 02:15:33,328 - root - INFO - SSH Connected to 172.23.121.222 as root 2021-09-14 02:15:33,605 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:33,909 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:33,910 - root - INFO - SSH Connecting to 172.23.121.221 with username:root, attempt#1 of 5 2021-09-14 02:15:34,010 - root - INFO - SSH Connected to 172.23.121.221 as root 2021-09-14 02:15:34,267 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:34,565 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:34,566 - root - INFO - SSH Connecting to 172.23.106.8 with username:root, attempt#1 of 5 2021-09-14 02:15:34,668 - root - INFO - SSH Connected to 
172.23.106.8 as root 2021-09-14 02:15:34,944 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:35,245 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:35,246 - root - INFO - SSH Connecting to 172.23.106.45 with username:root, attempt#1 of 5 2021-09-14 02:15:35,435 - root - INFO - SSH Connected to 172.23.106.45 as root 2021-09-14 02:15:35,901 - root - INFO - os_distro: CentOS, os_version: centos 7, is_linux_distro: True 2021-09-14 02:15:36,347 - root - INFO - extract_remote_info-->distribution_type: CentOS, distribution_version: centos 7 2021-09-14 02:15:36,347 - root - INFO - Trying to check is this url alive: http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:36,350 - root - INFO - This url http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm is live 2021-09-14 02:15:36,351 - root - INFO - Trying to check is this url alive: http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:36,352 - root - INFO - This url http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm is live 2021-09-14 02:15:36,352 - root - INFO - Trying to check is this url alive: http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:36,353 - root - INFO - This url http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm is live 2021-09-14 02:15:36,353 - root - INFO - Trying to check is this url alive: http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:36,354 - root - INFO - This url http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm is live 2021-09-14 02:15:36,355 - root - INFO - Trying to check is this url alive: http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:36,356 - root - INFO - This url http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm is live 2021-09-14 02:15:40,710 - root - INFO - Done with uninstall on 172.23.121.221. 2021-09-14 02:15:41,101 - root - INFO - Done with uninstall on 172.23.106.8. 2021-09-14 02:15:41,517 - root - INFO - Done with uninstall on 172.23.121.222. 2021-09-14 02:15:42,291 - root - INFO - Done with uninstall on 172.23.123.119. 2021-09-14 02:15:46,911 - root - INFO - Done with uninstall on 172.23.106.45. 
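The installer resolves build 7.1.0-1277 to an RPM URL on the internal latestbuilds mirror and, as the "Trying to check is this url alive" lines show, verifies the URL responds before each node downloads it. A minimal sketch of such a check, assuming a plain HTTP HEAD request via requests rather than the installer's own helper:

    # Sketch: verify the build URL responds before asking the nodes to download it.
    import requests

    BUILD_URL = ("http://172.23.126.166/builds/latestbuilds/couchbase-server/neo/1277/"
                 "couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm")

    def is_url_alive(url, timeout=10):
        try:
            resp = requests.head(url, timeout=timeout, allow_redirects=True)
            return resp.status_code < 400
        except requests.RequestException:
            return False

    if is_url_alive(BUILD_URL):
        print("This url %s is live" % BUILD_URL)
    else:
        raise RuntimeError("Build URL is not reachable: " + BUILD_URL)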
2021-09-14 02:15:56,405 - root - INFO - running command.raw on 172.23.123.119: cd /tmp/ && wc -c couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:56,457 - root - INFO - command executed successfully with root 2021-09-14 02:15:56,534 - root - INFO - running command.raw on 172.23.121.222: cd /tmp/ && wc -c couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:56,586 - root - INFO - command executed successfully with root 2021-09-14 02:15:56,659 - root - INFO - running command.raw on 172.23.121.221: cd /tmp/ && wc -c couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:56,710 - root - INFO - command executed successfully with root 2021-09-14 02:15:56,786 - root - INFO - running command.raw on 172.23.106.8: cd /tmp/ && wc -c couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:56,839 - root - INFO - command executed successfully with root 2021-09-14 02:15:56,948 - root - INFO - running command.raw on 172.23.106.45: cd /tmp/ && wc -c couchbase-server-enterprise-7.1.0-1277-centos7.x86_64.rpm 2021-09-14 02:15:56,986 - root - INFO - command executed successfully with root 2021-09-14 02:17:27,794 - root - INFO - Done with install on 172.23.123.119. 2021-09-14 02:17:28,143 - root - INFO - Done with install on 172.23.121.221. 2021-09-14 02:17:28,545 - root - INFO - Done with install on 172.23.121.222. 2021-09-14 02:17:31,616 - root - INFO - Done with install on 172.23.106.8. 2021-09-14 02:18:27,854 - root - INFO - running command.raw on 172.23.123.119: /opt/couchbase/bin/couchbase-cli node-init -c 172.23.123.119 -u Administrator -p password > /dev/null && echo 1 || echo 0; 2021-09-14 02:18:28,131 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,137 - root - INFO - running command.raw on 172.23.123.119: rm -rf /data/* 2021-09-14 02:18:28,199 - root - INFO - running command.raw on 172.23.121.221: /opt/couchbase/bin/couchbase-cli node-init -c 172.23.121.221 -u Administrator -p password > /dev/null && echo 1 || echo 0; 2021-09-14 02:18:28,364 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,364 - root - INFO - running command.raw on 172.23.123.119: chown -R couchbase:couchbase /data 2021-09-14 02:18:28,421 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,421 - root - INFO - /nodes/self/controller/settings : path=%2Fdata 2021-09-14 02:18:28,470 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,475 - root - INFO - running command.raw on 172.23.121.221: rm -rf /data/* 2021-09-14 02:18:28,488 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,488 - root - INFO - running command.raw on 172.23.121.221: chown -R couchbase:couchbase /data 2021-09-14 02:18:28,544 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,544 - root - INFO - /nodes/self/controller/settings : path=%2Fdata 2021-09-14 02:18:28,586 - root - INFO - running command.raw on 172.23.121.222: /opt/couchbase/bin/couchbase-cli node-init -c 172.23.121.222 -u Administrator -p password > /dev/null && echo 1 || echo 0; 2021-09-14 02:18:28,864 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,869 - root - INFO - running command.raw on 172.23.121.222: rm -rf /data/* 2021-09-14 02:18:28,882 - root - INFO - command executed successfully with root 2021-09-14 02:18:28,882 - root - INFO - running command.raw on 172.23.121.222: chown -R couchbase:couchbase /data 2021-09-14 02:18:28,938 - root - 
INFO - command executed successfully with root 2021-09-14 02:18:28,938 - root - INFO - /nodes/self/controller/settings : path=%2Fdata 2021-09-14 02:18:31,074 - root - INFO - Setting data_path: /data: status True 2021-09-14 02:18:31,143 - root - INFO - Setting data_path: /data: status True 2021-09-14 02:18:31,528 - root - INFO - Setting data_path: /data: status True 2021-09-14 02:18:31,617 - root - INFO - running command.raw on 172.23.106.8: /opt/couchbase/bin/couchbase-cli node-init -c 172.23.106.8 -u Administrator -p password > /dev/null && echo 1 || echo 0; 2021-09-14 02:18:31,906 - root - INFO - command executed successfully with root 2021-09-14 02:18:31,912 - root - INFO - running command.raw on 172.23.106.8: rm -rf /data/* 2021-09-14 02:18:31,926 - root - INFO - command executed successfully with root 2021-09-14 02:18:31,926 - root - INFO - running command.raw on 172.23.106.8: chown -R couchbase:couchbase /data 2021-09-14 02:18:31,987 - root - INFO - command executed successfully with root 2021-09-14 02:18:31,987 - root - INFO - /nodes/self/controller/settings : path=%2Fdata 2021-09-14 02:18:32,084 - root - INFO - Setting KV memory quota as 2684 MB on 172.23.123.119 2021-09-14 02:18:32,085 - root - INFO - pools/default params : memoryQuota=2684 2021-09-14 02:18:32,088 - root - INFO - --> init_node_services(Administrator,password,None,8091,['kv']) 2021-09-14 02:18:32,088 - root - INFO - /node/controller/setupServices params on 172.23.123.119: 8091:hostname=None&user=Administrator&password=password&services=kv 2021-09-14 02:18:32,117 - root - INFO - --> in init_cluster...Administrator,password,8091 2021-09-14 02:18:32,117 - root - INFO - settings/web params on 172.23.123.119:8091:port=8091&username=Administrator&password=password 2021-09-14 02:18:32,148 - root - INFO - Setting KV memory quota as 2684 MB on 172.23.121.221 2021-09-14 02:18:32,148 - root - INFO - pools/default params : memoryQuota=2684 2021-09-14 02:18:32,152 - root - INFO - --> init_node_services(Administrator,password,None,8091,['kv']) 2021-09-14 02:18:32,152 - root - INFO - /node/controller/setupServices params on 172.23.121.221: 8091:hostname=None&user=Administrator&password=password&services=kv 2021-09-14 02:18:32,178 - root - INFO - --> status:True 2021-09-14 02:18:32,178 - root - INFO - Done with init on 172.23.123.119. 2021-09-14 02:18:32,181 - root - INFO - --> in init_cluster...Administrator,password,8091 2021-09-14 02:18:32,181 - root - INFO - settings/web params on 172.23.121.221:8091:port=8091&username=Administrator&password=password 2021-09-14 02:18:32,197 - root - INFO - Done with cleanup on 172.23.123.119. 2021-09-14 02:18:32,242 - root - INFO - --> status:True 2021-09-14 02:18:32,242 - root - INFO - Done with init on 172.23.121.221. 2021-09-14 02:18:32,265 - root - INFO - Done with cleanup on 172.23.121.221. 
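Each freshly installed node goes through the same initialization seen above: couchbase-cli node-init, wiping and re-owning /data, then REST calls to /nodes/self/controller/settings (data path), /node/controller/setupServices, /settings/web (admin credentials and port), and /pools/default (KV memory quota). Below is a condensed sketch of that REST sequence using requests and basic auth with the credentials from the ini; the endpoint paths are the ones logged, but parameter handling and error handling are simplified compared to the framework's rest_client.

    # Sketch: initialize one Couchbase node over REST, mirroring the logged endpoints.
    import requests

    def init_node(host, user="Administrator", password="password",
                  data_path="/data", services="kv", kv_quota_mb=2684):
        base = "http://%s:8091" % host
        auth = (user, password)

        # 1. Point the node's data directory at /data
        #    (logged as "/nodes/self/controller/settings : path=%2Fdata").
        requests.post(base + "/nodes/self/controller/settings",
                      data={"path": data_path}, auth=auth).raise_for_status()

        # 2. Declare the services this node will run (kv only in this job).
        requests.post(base + "/node/controller/setupServices",
                      data={"services": services}, auth=auth).raise_for_status()

        # 3. Set the administrator credentials and REST port (settings/web).
        requests.post(base + "/settings/web",
                      data={"port": 8091, "username": user, "password": password},
                      auth=auth).raise_for_status()

        # 4. Set the KV memory quota (logged as "pools/default params : memoryQuota=...").
        requests.post(base + "/pools/default",
                      data={"memoryQuota": kv_quota_mb}, auth=auth).raise_for_status()

    init_node("172.23.123.119")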
2021-09-14 02:18:32,533 - root - INFO - Setting KV memory quota as 2684 MB on 172.23.121.222 2021-09-14 02:18:32,533 - root - INFO - pools/default params : memoryQuota=2684 2021-09-14 02:18:32,537 - root - INFO - --> init_node_services(Administrator,password,None,8091,['kv']) 2021-09-14 02:18:32,537 - root - INFO - /node/controller/setupServices params on 172.23.121.222: 8091:hostname=None&user=Administrator&password=password&services=kv 2021-09-14 02:18:32,567 - root - INFO - --> in init_cluster...Administrator,password,8091 2021-09-14 02:18:32,567 - root - INFO - settings/web params on 172.23.121.222:8091:port=8091&username=Administrator&password=password 2021-09-14 02:18:32,628 - root - INFO - --> status:True 2021-09-14 02:18:32,628 - root - INFO - Done with init on 172.23.121.222. 2021-09-14 02:18:32,647 - root - INFO - Done with cleanup on 172.23.121.222. 2021-09-14 02:18:34,671 - root - INFO - Setting data_path: /data: status True 2021-09-14 02:18:35,710 - root - INFO - Setting KV memory quota as 2147 MB on 172.23.106.8 2021-09-14 02:18:35,711 - root - INFO - pools/default params : memoryQuota=2147 2021-09-14 02:18:35,715 - root - INFO - --> init_node_services(Administrator,password,None,8091,['kv']) 2021-09-14 02:18:35,715 - root - INFO - /node/controller/setupServices params on 172.23.106.8: 8091:hostname=None&user=Administrator&password=password&services=kv 2021-09-14 02:18:35,747 - root - INFO - --> in init_cluster...Administrator,password,8091 2021-09-14 02:18:35,747 - root - INFO - settings/web params on 172.23.106.8:8091:port=8091&username=Administrator&password=password 2021-09-14 02:18:35,812 - root - INFO - --> status:True 2021-09-14 02:18:35,812 - root - INFO - Done with init on 172.23.106.8. 2021-09-14 02:18:35,841 - root - INFO - Done with cleanup on 172.23.106.8. 2021-09-14 02:18:43,525 - root - INFO - Done with install on 172.23.106.45. 2021-09-14 02:19:43,585 - root - INFO - running command.raw on 172.23.106.45: /opt/couchbase/bin/couchbase-cli node-init -c 172.23.106.45 -u Administrator -p password > /dev/null && echo 1 || echo 0; 2021-09-14 02:19:44,275 - root - INFO - command executed successfully with root 2021-09-14 02:19:44,285 - root - INFO - running command.raw on 172.23.106.45: rm -rf /data/* 2021-09-14 02:19:44,316 - root - INFO - command executed successfully with root 2021-09-14 02:19:44,317 - root - INFO - running command.raw on 172.23.106.45: chown -R couchbase:couchbase /data 2021-09-14 02:19:44,392 - root - INFO - command executed successfully with root 2021-09-14 02:19:44,392 - root - INFO - /nodes/self/controller/settings : path=%2Fdata 2021-09-14 02:19:50,651 - root - INFO - Setting data_path: /data: status True 2021-09-14 02:19:51,660 - root - INFO - Setting KV memory quota as 2148 MB on 172.23.106.45 2021-09-14 02:19:51,660 - root - INFO - pools/default params : memoryQuota=2148 2021-09-14 02:19:51,668 - root - INFO - --> init_node_services(Administrator,password,None,8091,['kv']) 2021-09-14 02:19:51,668 - root - INFO - /node/controller/setupServices params on 172.23.106.45: 8091:hostname=None&user=Administrator&password=password&services=kv 2021-09-14 02:19:51,707 - root - INFO - --> in init_cluster...Administrator,password,8091 2021-09-14 02:19:51,707 - root - INFO - settings/web params on 172.23.106.45:8091:port=8091&username=Administrator&password=password 2021-09-14 02:19:51,863 - root - INFO - --> status:True 2021-09-14 02:19:51,863 - root - INFO - Done with init on 172.23.106.45. 
2021-09-14 02:19:51,915 - root - INFO - Done with cleanup on 172.23.106.45.
2021-09-14 02:19:57,255 - root - INFO - ----------------------------------------------------------------------------------------------------
2021-09-14 02:19:57,285 - root - INFO - cluster:C1 node:172.23.123.119:8091 version:7.1.0-1277-enterprise aFamily:inet services:['kv']
2021-09-14 02:19:57,285 - root - INFO - cluster:C2 node:172.23.121.222:8091 version:7.1.0-1277-enterprise aFamily:inet services:['kv']
2021-09-14 02:19:57,285 - root - INFO - cluster:C3 node:172.23.121.221:8091 version:7.1.0-1277-enterprise aFamily:inet services:['kv']
2021-09-14 02:19:57,285 - root - INFO - cluster:C4 node:172.23.106.8:8091 version:7.1.0-1277-enterprise aFamily:inet services:['kv']
2021-09-14 02:19:57,285 - root - INFO - cluster:C5 node:172.23.106.45:8091 version:7.1.0-1277-enterprise aFamily:inet services:['kv']
2021-09-14 02:19:57,285 - root - INFO - ----------------------------------------------------------------------------------------------------
2021-09-14 02:19:57,285 - root - INFO - ----------------------------------------------------------------------------------------------------
2021-09-14 02:19:57,285 - root - INFO - ----------------------------------------------------------------------------------------------------
2021-09-14 02:19:57,285 - root - INFO - INSTALL COMPLETED ON: 172.23.123.119
2021-09-14 02:19:57,285 - root - INFO - INSTALL COMPLETED ON: 172.23.121.222
2021-09-14 02:19:57,285 - root - INFO - INSTALL COMPLETED ON: 172.23.121.221
2021-09-14 02:19:57,285 - root - INFO - INSTALL COMPLETED ON: 172.23.106.8
2021-09-14 02:19:57,285 - root - INFO - INSTALL COMPLETED ON: 172.23.106.45
2021-09-14 02:19:57,285 - root - INFO - ----------------------------------------------------------------------------------------------------
2021-09-14 02:19:57,286 - root - INFO - TOTAL INSTALL TIME = 269 seconds
Need to set ALLOW_HTP back to True to do git pull branch
Switched to a new branch 'master'
Branch master set up to track remote branch master from origin.
From git://github.com/couchbaselabs/TAF
 * branch master -> FETCH_HEAD
Already up-to-date.
Set ALLOW_HTP to False so test could run.
HEAD is now at 04ebae7 Adding functional tests for mutate same key case find: ‘/data/workspace/centos-p0-collections-vset00-00-bucket_warmup_7.0_P0/logs/testrunner-21-Aug-14_01-47-42’: No such file or directory find: ‘/data/workspace/centos-p0-query-vset00-00-auditing_filtering/logs/testrunner-21-Aug-14_02-12-07’: No such file or directory find: ‘/root/workspace/*/logs/*’: No such file or directory find: ‘/root/workspace/’: No such file or directory python: no process found jython: no process found Timeout: 1200 minutes > Configure project : Executing 'gradle clean' Using Transaction_client :: 1.1.8 Using Java_client :: 3.1.6 Running: /opt/jython/bin/jython -J-cp /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/ssh.py -i /tmp/testexec.38389.ini -c conf/collections/collections_failover_dgm.conf -p GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -m rest Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/eagles_all_around.py -i /tmp/testexec.38389.ini -c conf/collections/collections_failover_dgm.conf -p GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -m rest Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main scripts/install.py -i /tmp/testexec.38389.ini -c conf/collections/collections_failover_dgm.conf -p GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -m rest Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources testrunner.py -i /tmp/testexec.38389.ini -c conf/collections/collections_failover_dgm.conf -p GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -m rest Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources scripts/rerun_jobs.py -i /tmp/testexec.38389.ini -c conf/collections/collections_failover_dgm.conf -p GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -m rest > Task :compileJava warning: unknown enum constant javax.annotation.meta.When.MAYBE reason: class file for javax.annotation.meta.When not found Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. Note: Some input files use unchecked or unsafe operations. Note: Recompile with -Xlint:unchecked for details. 
1 warning

> Task :testrunner
Filename: conf/collections/collections_failover_dgm.conf
Prefix: bucket_collections.collections_rebalance.CollectionsRebalance
Global Test input params:
{'GROUP': 'P0_failover_and_recovery_dgm', 'bucket_storage': 'magma', 'cluster_name': 'testexec.38389', 'conf_file': 'conf/collections/collections_failover_dgm.conf', 'enable_dp': 'True', 'get-cbcollect-info': 'True', 'infra_log_level': 'critical', 'ini': '/tmp/testexec.38389.ini', 'log_level': 'error', 'num_nodes': 5, 'rerun': 'False', 'spec': 'collections_failover_dgm', 'upgrade_version': '7.1.0-1277'}
Only cases in GROUPs 'P0_failover_and_recovery_dgm' will be executed
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_rebalance_out,nodes_init=3,nodes_failover=1,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_rebalance_out_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,nodes_failover=1,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_rebalance_out_dgm' skipped, GROUP not satisfied
Logs will be stored at /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_1
guides/gradlew --refresh-dependencies testrunner -P jython=/opt/jython/bin/jython -P 'args=-i /tmp/testexec.38389.ini GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -t bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_recovery_dgm'
Test Input params:
{'data_load_stage': 'during', 'conf_file': 'conf/collections/collections_failover_dgm.conf', 'upgrade_version': '7.1.0-1277', 'dgm': '40', 'spec': 'collections_failover_dgm', 'rerun': 'False', 'num_nodes': 5, 'GROUP': 'P0_failover_and_recovery_dgm', 'enable_dp': 'True', 'bucket_spec': 'dgm.buckets_for_rebalance_tests', 'case_number': 1, 'cluster_name': 'testexec.38389', 'nodes_failover': '1', 'ini': '/tmp/testexec.38389.ini', 'get-cbcollect-info': 'True', 'recovery_type': 'full', 'log_level': 'error', 'bucket_storage': 'magma', 'skip_validations': 'False', 'logs_folder': '/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_1', 'nodes_init': '3', 'infra_log_level': 'critical'}
test_data_load_collections_with_graceful_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ...
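The :testrunner task parses the conf file and runs only the cases whose GROUP matches the requested group (here P0_failover_and_recovery_dgm); everything else is reported as "skipped, GROUP not satisfied". A rough sketch of that filtering, assuming one test specification per conf line with comma-separated key=value parameters (the real parser in testrunner handles more syntax than this):

    # Sketch: keep only the conf entries whose GROUP matches the requested group.
    def filter_by_group(conf_lines, wanted_group):
        selected, skipped = [], []
        for line in conf_lines:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            params = dict(p.split("=", 1) for p in line.split(",")[1:] if "=" in p)
            groups = params.get("GROUP", "").split(";")
            if wanted_group in groups:
                selected.append(line)
            else:
                skipped.append(line)
        return selected, skipped

    conf = [
        "bucket_collections.collections_rebalance.CollectionsRebalance."
        "test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,"
        "nodes_failover=1,recovery_type=full,GROUP=P0_failover_and_recovery_dgm",
        "bucket_collections.collections_rebalance.CollectionsRebalance."
        "test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,"
        "GROUP=P0_failover_and_rebalance_out_dgm",
    ]
    run, skip = filter_by_group(conf, "P0_failover_and_recovery_dgm")
    for t in skip:
        print("Test '%s' skipped, GROUP not satisfied" % t)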
Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x10 type=client open_count=1 channel=[id: 0x1d5c6eaf, 0.0.0.0/0.0.0.0:41603] timeout=300.0>) 2021-09-14 02:20:22,217 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x11 type=client open_count=1 channel=[id: 0x7c1dd717, 0.0.0.0/0.0.0.0:43501] timeout=300.0>) 2021-09-14 02:20:24,654 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x12 type=client open_count=1 channel=[id: 0x1782610e, 0.0.0.0/0.0.0.0:33086] timeout=300.0>) 2021-09-14 02:20:26,677 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x13 type=client open_count=1 channel=[id: 0x36a43317, 0.0.0.0/0.0.0.0:35572] timeout=300.0>) 2021-09-14 02:20:28,697 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x14 type=client open_count=1 channel=[id: 0xe8fe2f88, 0.0.0.0/0.0.0.0:35195] timeout=300.0>) 2021-09-14 02:20:34,786 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x15 type=client open_count=1 channel=[id: 0xd6c6a886, 0.0.0.0/0.0.0.0:60468] timeout=300.0>) 2021-09-14 02:20:36,805 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x16 type=client open_count=1 channel=[id: 0xe6ccb894, 0.0.0.0/0.0.0.0:60256] timeout=300.0>) 2021-09-14 02:20:38,825 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x17 type=client open_count=1 channel=[id: 0x72bc3f53, 0.0.0.0/0.0.0.0:51284] timeout=300.0>) 2021-09-14 02:20:40,841 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x18 type=client open_count=1 channel=[id: 0x451b1878, 0.0.0.0/0.0.0.0:33414] timeout=300.0>) 2021-09-14 02:20:42,858 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error 
while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x19 type=client open_count=1 channel=[id: 0xadbaa92c, 0.0.0.0/0.0.0.0:49120] timeout=300.0>) 2021-09-14 02:20:48,377 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x1a type=client open_count=1 channel=[id: 0x5c924ed0, 0.0.0.0/0.0.0.0:58205] timeout=300.0>) 2021-09-14 02:20:50,392 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x1b type=client open_count=1 channel=[id: 0x0de6b6f6, 0.0.0.0/0.0.0.0:36926] timeout=300.0>) 2021-09-14 02:20:52,411 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x1c type=client open_count=1 channel=[id: 0x4160b447, 0.0.0.0/0.0.0.0:45896] timeout=300.0>) 2021-09-14 02:20:54,427 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x1d type=client open_count=1 channel=[id: 0x8bcb19fb, 0.0.0.0/0.0.0.0:45061] timeout=300.0>) 2021-09-14 02:21:00,946 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x1e type=client open_count=1 channel=[id: 0x016ba517, 0.0.0.0/0.0.0.0:35976] timeout=300.0>) 2021-09-14 02:21:02,961 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x1f type=client open_count=1 channel=[id: 0xba7f000a, 0.0.0.0/0.0.0.0:48537] timeout=300.0>) 2021-09-14 02:21:04,979 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x20 type=client open_count=1 channel=[id: 0xe4842419, 0.0.0.0/0.0.0.0:46752] timeout=300.0>) 2021-09-14 02:21:06,993 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x21 type=client open_count=1 channel=[id: 0x14377a44, 0.0.0.0/0.0.0.0:56253] timeout=300.0>) 2021-09-14 02:21:09,009 | 
test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x22 type=client open_count=1 channel=[id: 0xcf3ee997, 0.0.0.0/0.0.0.0:35353] timeout=300.0>) 2021-09-14 02:21:17,267 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x23 type=client open_count=1 channel=[id: 0xd9ba376b, 0.0.0.0/0.0.0.0:33212] timeout=300.0>) 2021-09-14 02:21:19,282 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x24 type=client open_count=1 channel=[id: 0x942ad6dc, 0.0.0.0/0.0.0.0:42508] timeout=300.0>) 2021-09-14 02:21:21,296 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x25 type=client open_count=1 channel=[id: 0x6f300f52, 0.0.0.0/0.0.0.0:43468] timeout=300.0>) 2021-09-14 02:21:23,312 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x26 type=client open_count=1 channel=[id: 0xb996b47c, 0.0.0.0/0.0.0.0:41942] timeout=300.0>) 2021-09-14 02:21:25,329 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x27 type=client open_count=1 channel=[id: 0x619255ca, 0.0.0.0/0.0.0.0:45090] timeout=300.0>) 2021-09-14 02:21:27,348 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x28 type=client open_count=1 channel=[id: 0x4908aba3, 0.0.0.0/0.0.0.0:53503] timeout=300.0>) 2021-09-14 02:21:29,362 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x29 type=client open_count=1 channel=[id: 0x39213c28, 0.0.0.0/0.0.0.0:49014] timeout=300.0>) 2021-09-14 02:21:31,381 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x2a type=client open_count=1 channel=[id: 
0x405bc7bf, 0.0.0.0/0.0.0.0:39025] timeout=300.0>)
2021-09-14 02:21:33,394 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused
Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x2b type=client open_count=1 channel=[id: 0xa21c2dc8, 0.0.0.0/0.0.0.0:32953] timeout=300.0>)
2021-09-14 02:21:35,408 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused
Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x2c type=client open_count=1 channel=[id: 0xe976341d, 0.0.0.0/0.0.0.0:44869] timeout=300.0>)
2021-09-14 02:21:37,423 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused
Traceback (most recent call last):
  File "pytests/bucket_collections/collections_base.py", line 63, in setUp
    self.collection_setup()
  File "pytests/bucket_collections/collections_base.py", line 158, in collection_setup
    self.create_sdk_clients(self.task_manager.number_of_threads,
  File "pytests/bucket_collections/collections_base.py", line 111, in create_sdk_clients
    sdk_client_pool.create_clients(
  File "lib/sdk_client3.py", line 111, in create_clients
    self.clients[bucket.name]["idle_clients"].append(SDKClient(
  File "lib/sdk_client3.py", line 241, in __init__
    self.__create_conn()
  File "lib/sdk_client3.py", line 303, in _SDKClient__create_conn
    self.bucketObj.waitUntilReady(
UnambiguousTimeoutException: com.couchbase.client.core.error.UnambiguousTimeoutException: WaitUntilReady timed out
{"bucket":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","checkedServices":["KV"],"currentState":"ONLINE","desiredState":"ONLINE","services":{"mgmt":[{"last_activity_us":27271,"state":"connected","id":"0x1672bec9","remote":"172.23.123.119:8091","local":"172.23.123.71:55154"},{"last_activity_us":17531,"state":"connected","id":"0x3483c453","remote":"172.23.121.221:8091","local":"172.23.123.71:43962"},{"last_activity_us":7922,"state":"connected","id":"0xbc306aff","remote":"172.23.121.222:8091","local":"172.23.123.71:46268"}],"kv":[{"last_activity_us":275451,"state":"connected","id":"0xafbd46ab","remote":"172.23.123.119:11210","local":"172.23.123.71:38826"},{"last_activity_us":2280083,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb0dee407","remote":"172.23.123.119:11210","local":"172.23.123.71:38830"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc123512b","remote":"172.23.123.119:11210","local":"172.23.123.71:38832"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xba798f7c","remote":"172.23.123.119:11210","local":"172.23.123.71:38828"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xdeac3b26","remote":"172.23.123.119:11210","local":"172.23.123.71:38834"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc60f5a1a","remote":"172.23.123.119:11210","local":"172.23.123.71:38836"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x154e0536","remote":"172.23.123.119:11210","local":"172.23.123.71:38846"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x9dbe1be4","remote":"172.23.123.119:11210","local":"172.23.123.71:38844"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd393f829","remote":"172.23.123.119:11210","local":"172.23.123.71:38838"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8bfdcc5b","remote":"172.23.123.119:11210","local":"172.23.123.71:38860"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x26aef6a6","remote":"172.23.123.119:11210","local":"172.23.123.71:38841"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x167c6ad8","remote":"172.23.123.119:11210","local":"172.23.123.71:38840"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb6aac39d","remote":"172.23.123.119:11210","local":"172.23.123.71:38864"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb8091e13","remote":"172.23.123.119:11210","local":"172.23.123.71:38854"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc7753219","remote":"172.23.123.119:11210","local":"172.23.123.71:38850"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3d6421d2","remote":"172.23.123.119:11210","local":"172.23.123.71:38868"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXP
Br6NP-56-161000","state":"connected","id":"0xfa5f04f6","remote":"172.23.123.119:11210","local":"172.23.123.71:38852"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xaf235666","remote":"172.23.123.119:11210","local":"172.23.123.71:38848"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x00308b9e","remote":"172.23.123.119:11210","local":"172.23.123.71:38872"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xdcceb55a","remote":"172.23.123.119:11210","local":"172.23.123.71:38862"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x4a3ad0b4","remote":"172.23.123.119:11210","local":"172.23.123.71:38858"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7bad521a","remote":"172.23.123.119:11210","local":"172.23.123.71:38876"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xfeb60987","remote":"172.23.123.119:11210","local":"172.23.123.71:38870"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3a37cf10","remote":"172.23.123.119:11210","local":"172.23.123.71:38856"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x848e9b5a","remote":"172.23.123.119:11210","local":"172.23.123.71:38874"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd464285d","remote":"172.23.123.119:11210","local":"172.23.123.71:38866"},{"last_activity_us":276863,"state":"connected","id":"0xcc2f9a67","remote":"172.23.121.221:11210","local":"172.23.123.71:47240"},{"last_activity_us":2280329,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xca44bb1b","remote":"172.23.121.221:11210","local":"172.23.123.71:47244"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xf6b7eb6c","remote":"172.23.121.221:11210","local":"172.23.123.71:47246"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x49ec29b0","remote":"172.23.121.221:11210","local":"172.23.123.71:47248"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd3e36212","remote":"172.23.121.221:11210","local":"172.23.123.71:47326"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x98063be8","remote":"172.23.121.221:11210","local":"172.23.123.71:47250"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x15b0810f","remote":"172.23.121.221:11210","local":"172.23.123.71:47252"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x1d273fbf","remote":"172.23.121.221:11210","local":"172.23.123.71:47274"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x69884a0b","remote":"172.23.121.221:11210","local":"172.23.123.71:47256"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7094f471","remote":"172.2
3.121.221:11210","local":"172.23.123.71:47268"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x22e3e43b","remote":"172.23.121.221:11210","local":"172.23.123.71:47328"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3a66f5f9","remote":"172.23.121.221:11210","local":"172.23.123.71:47254"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x730334b9","remote":"172.23.121.221:11210","local":"172.23.123.71:47310"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xbf1af6ec","remote":"172.23.121.221:11210","local":"172.23.123.71:47280"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xcfe109a0","remote":"172.23.121.221:11210","local":"172.23.123.71:47258"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8bf18046","remote":"172.23.121.221:11210","local":"172.23.123.71:47270"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x5f535df4","remote":"172.23.121.221:11210","local":"172.23.123.71:47330"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x5ff0f567","remote":"172.23.121.221:11210","local":"172.23.123.71:47292"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x40d98b45","remote":"172.23.121.221:11210","local":"172.23.123.71:47314"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3802e1f5","remote":"172.23.121.221:11210","local":"172.23.123.71:47284"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x16085d1a","remote":"172.23.121.221:11210","local":"172.23.123.71:47260"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x01baf9be","remote":"172.23.121.221:11210","local":"172.23.123.71:47272"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3c65ba3b","remote":"172.23.121.221:11210","local":"172.23.123.71:47332"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x6aec37ac","remote":"172.23.121.221:11210","local":"172.23.123.71:47296"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x164fd4d3","remote":"172.23.121.221:11210","local":"172.23.123.71:47316"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x728bce5e","remote":"172.23.121.221:11210","local":"172.23.123.71:47286"},{"last_activity_us":277098,"state":"connected","id":"0x8220705a","remote":"172.23.121.222:11210","local":"172.23.123.71:43122"},{"last_activity_us":2279218,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb9f06c04","remote":"172.23.121.222:11210","local":"172.23.123.71:43142"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x51f8e4b4","remote":"172.23.121.222:11210","local":"172.23.123.71:43156"},{"namespace":"BGl%acc7
vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb66aa607","remote":"172.23.121.222:11210","local":"172.23.123.71:43214"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x37756294","remote":"172.23.121.222:11210","local":"172.23.123.71:43178"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x56ac64a4","remote":"172.23.121.222:11210","local":"172.23.123.71:43198"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x71e13b88","remote":"172.23.121.222:11210","local":"172.23.123.71:43168"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3d33ea12","remote":"172.23.121.222:11210","local":"172.23.123.71:43144"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3cfeb9c7","remote":"172.23.121.222:11210","local":"172.23.123.71:43158"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xffab3d1a","remote":"172.23.121.222:11210","local":"172.23.123.71:43216"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc082ad14","remote":"172.23.121.222:11210","local":"172.23.123.71:43180"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x48386470","remote":"172.23.121.222:11210","local":"172.23.123.71:43200"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd5509b34","remote":"172.23.121.222:11210","local":"172.23.123.71:43170"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x2c4a7c8a","remote":"172.23.121.222:11210","local":"172.23.123.71:43146"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7f360270","remote":"172.23.121.222:11210","local":"172.23.123.71:43162"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xaebf2d52","remote":"172.23.121.222:11210","local":"172.23.123.71:43218"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x289dbcf9","remote":"172.23.121.222:11210","local":"172.23.123.71:43182"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x56f3bb1b","remote":"172.23.121.222:11210","local":"172.23.123.71:43202"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb13505fe","remote":"172.23.121.222:11210","local":"172.23.123.71:43174"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8d2b5e28","remote":"172.23.121.222:11210","local":"172.23.123.71:43186"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xefaec809","remote":"172.23.121.222:11210","local":"172.23.123.71:43184"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x94a5eaba","remote":"172.23.121.222:11210","local":"172.23.123.71:43220"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6
dIOXPBr6NP-56-161000","state":"connected","id":"0xb75d4c7c","remote":"172.23.121.222:11210","local":"172.23.123.71:43222"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7df11dbd","remote":"172.23.121.222:11210","local":"172.23.123.71:43204"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x113b6aa4","remote":"172.23.121.222:11210","local":"172.23.123.71:43192"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8595baaa","remote":"172.23.121.222:11210","local":"172.23.123.71:43188"}]},"state":{"current_stage":"BUCKET_NODES_HEALTHY","current_stage_since_ms":22,"timings_ms":{"BUCKET_CONFIG_READY":0,"CONFIG_LOAD":424,"BUCKET_NODES_HEALTHY":10},"total_ms":109537},"timeoutMs":120000} ERROR ====================================================================== ERROR: test_data_load_collections_with_graceful_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ---------------------------------------------------------------------- Traceback (most recent call last): File "pytests/bucket_collections/collections_rebalance.py", line 22, in setUp super(CollectionsRebalance, self).setUp() File "pytests/bucket_collections/collections_base.py", line 65, in setUp self.handle_setup_exception(exception) File "pytests/basetestcase.py", line 609, in handle_setup_exception raise exception_obj UnambiguousTimeoutException: com.couchbase.client.core.error.UnambiguousTimeoutException: WaitUntilReady timed out {"bucket":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","checkedServices":["KV"],"currentState":"ONLINE","desiredState":"ONLINE","services":{"mgmt":[{"last_activity_us":27271,"state":"connected","id":"0x1672bec9","remote":"172.23.123.119:8091","local":"172.23.123.71:55154"},{"last_activity_us":17531,"state":"connected","id":"0x3483c453","remote":"172.23.121.221:8091","local":"172.23.123.71:43962"},{"last_activity_us":7922,"state":"connected","id":"0xbc306aff","remote":"172.23.121.222:8091","local":"172.23.123.71:46268"}],"kv":[{"last_activity_us":275451,"state":"connected","id":"0xafbd46ab","remote":"172.23.123.119:11210","local":"172.23.123.71:38826"},{"last_activity_us":2280083,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb0dee407","remote":"172.23.123.119:11210","local":"172.23.123.71:38830"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc123512b","remote":"172.23.123.119:11210","local":"172.23.123.71:38832"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xba798f7c","remote":"172.23.123.119:11210","local":"172.23.123.71:38828"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xdeac3b26","remote":"172.23.123.119:11210","local":"172.23.123.71:38834"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc60f5a1a","remote":"172.23.123.119:11210","local":"172.23.123.71:38836"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x154e0536","remote":"172.23.123.119:11210","local":"172.23.123.71:38846"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","i
d":"0x9dbe1be4","remote":"172.23.123.119:11210","local":"172.23.123.71:38844"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd393f829","remote":"172.23.123.119:11210","local":"172.23.123.71:38838"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8bfdcc5b","remote":"172.23.123.119:11210","local":"172.23.123.71:38860"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x26aef6a6","remote":"172.23.123.119:11210","local":"172.23.123.71:38841"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x167c6ad8","remote":"172.23.123.119:11210","local":"172.23.123.71:38840"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb6aac39d","remote":"172.23.123.119:11210","local":"172.23.123.71:38864"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb8091e13","remote":"172.23.123.119:11210","local":"172.23.123.71:38854"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc7753219","remote":"172.23.123.119:11210","local":"172.23.123.71:38850"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3d6421d2","remote":"172.23.123.119:11210","local":"172.23.123.71:38868"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xfa5f04f6","remote":"172.23.123.119:11210","local":"172.23.123.71:38852"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xaf235666","remote":"172.23.123.119:11210","local":"172.23.123.71:38848"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x00308b9e","remote":"172.23.123.119:11210","local":"172.23.123.71:38872"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xdcceb55a","remote":"172.23.123.119:11210","local":"172.23.123.71:38862"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x4a3ad0b4","remote":"172.23.123.119:11210","local":"172.23.123.71:38858"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7bad521a","remote":"172.23.123.119:11210","local":"172.23.123.71:38876"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xfeb60987","remote":"172.23.123.119:11210","local":"172.23.123.71:38870"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3a37cf10","remote":"172.23.123.119:11210","local":"172.23.123.71:38856"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x848e9b5a","remote":"172.23.123.119:11210","local":"172.23.123.71:38874"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd464285d","remote":"172.23.123.119:11210","local":"172.23.123.71:38866"},{"last_activity_us":276863,"state":"connected","id":"0xcc2f9a67","remote":"172.23.121.221:11210","local":"172.23.123.71:47240"},{"last_activity_us
":2280329,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xca44bb1b","remote":"172.23.121.221:11210","local":"172.23.123.71:47244"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xf6b7eb6c","remote":"172.23.121.221:11210","local":"172.23.123.71:47246"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x49ec29b0","remote":"172.23.121.221:11210","local":"172.23.123.71:47248"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd3e36212","remote":"172.23.121.221:11210","local":"172.23.123.71:47326"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x98063be8","remote":"172.23.121.221:11210","local":"172.23.123.71:47250"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x15b0810f","remote":"172.23.121.221:11210","local":"172.23.123.71:47252"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x1d273fbf","remote":"172.23.121.221:11210","local":"172.23.123.71:47274"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x69884a0b","remote":"172.23.121.221:11210","local":"172.23.123.71:47256"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7094f471","remote":"172.23.121.221:11210","local":"172.23.123.71:47268"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x22e3e43b","remote":"172.23.121.221:11210","local":"172.23.123.71:47328"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3a66f5f9","remote":"172.23.121.221:11210","local":"172.23.123.71:47254"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x730334b9","remote":"172.23.121.221:11210","local":"172.23.123.71:47310"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xbf1af6ec","remote":"172.23.121.221:11210","local":"172.23.123.71:47280"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xcfe109a0","remote":"172.23.121.221:11210","local":"172.23.123.71:47258"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8bf18046","remote":"172.23.121.221:11210","local":"172.23.123.71:47270"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x5f535df4","remote":"172.23.121.221:11210","local":"172.23.123.71:47330"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x5ff0f567","remote":"172.23.121.221:11210","local":"172.23.123.71:47292"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x40d98b45","remote":"172.23.121.221:11210","local":"172.23.123.71:47314"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3802e1f5","remote":"172.23.121.221:11210","local":"172.23.123.71:47284"},{"namespace":"BGl%acc7vO8b8XXUE
gV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x16085d1a","remote":"172.23.121.221:11210","local":"172.23.123.71:47260"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x01baf9be","remote":"172.23.121.221:11210","local":"172.23.123.71:47272"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3c65ba3b","remote":"172.23.121.221:11210","local":"172.23.123.71:47332"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x6aec37ac","remote":"172.23.121.221:11210","local":"172.23.123.71:47296"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x164fd4d3","remote":"172.23.121.221:11210","local":"172.23.123.71:47316"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x728bce5e","remote":"172.23.121.221:11210","local":"172.23.123.71:47286"},{"last_activity_us":277098,"state":"connected","id":"0x8220705a","remote":"172.23.121.222:11210","local":"172.23.123.71:43122"},{"last_activity_us":2279218,"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb9f06c04","remote":"172.23.121.222:11210","local":"172.23.123.71:43142"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x51f8e4b4","remote":"172.23.121.222:11210","local":"172.23.123.71:43156"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb66aa607","remote":"172.23.121.222:11210","local":"172.23.123.71:43214"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x37756294","remote":"172.23.121.222:11210","local":"172.23.123.71:43178"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x56ac64a4","remote":"172.23.121.222:11210","local":"172.23.123.71:43198"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x71e13b88","remote":"172.23.121.222:11210","local":"172.23.123.71:43168"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3d33ea12","remote":"172.23.121.222:11210","local":"172.23.123.71:43144"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x3cfeb9c7","remote":"172.23.121.222:11210","local":"172.23.123.71:43158"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xffab3d1a","remote":"172.23.121.222:11210","local":"172.23.123.71:43216"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xc082ad14","remote":"172.23.121.222:11210","local":"172.23.123.71:43180"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x48386470","remote":"172.23.121.222:11210","local":"172.23.123.71:43200"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xd5509b34","remote":"172.23.121.222:11210","local":"172.23.123.71:43170"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connecte
d","id":"0x2c4a7c8a","remote":"172.23.121.222:11210","local":"172.23.123.71:43146"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7f360270","remote":"172.23.121.222:11210","local":"172.23.123.71:43162"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xaebf2d52","remote":"172.23.121.222:11210","local":"172.23.123.71:43218"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x289dbcf9","remote":"172.23.121.222:11210","local":"172.23.123.71:43182"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x56f3bb1b","remote":"172.23.121.222:11210","local":"172.23.123.71:43202"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb13505fe","remote":"172.23.121.222:11210","local":"172.23.123.71:43174"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8d2b5e28","remote":"172.23.121.222:11210","local":"172.23.123.71:43186"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xefaec809","remote":"172.23.121.222:11210","local":"172.23.123.71:43184"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x94a5eaba","remote":"172.23.121.222:11210","local":"172.23.123.71:43220"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0xb75d4c7c","remote":"172.23.121.222:11210","local":"172.23.123.71:43222"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x7df11dbd","remote":"172.23.121.222:11210","local":"172.23.123.71:43204"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x113b6aa4","remote":"172.23.121.222:11210","local":"172.23.123.71:43192"},{"namespace":"BGl%acc7vO8b8XXUEgV9Q2LLDRi-pH6poka4fXF-5nYDMdH6dIOXPBr6NP-56-161000","state":"connected","id":"0x8595baaa","remote":"172.23.121.222:11210","local":"172.23.123.71:43188"}]},"state":{"current_stage":"BUCKET_NODES_HEALTHY","current_stage_since_ms":24,"timings_ms":{"BUCKET_CONFIG_READY":0,"CONFIG_LOAD":424,"BUCKET_NODES_HEALTHY":10},"total_ms":109539},"timeoutMs":120000} ---------------------------------------------------------------------- Ran 1 test in 298.638s FAILED (errors=1) During the test, Remote Connections: 36, Disconnections: 36 SDK Connections: 0, Disconnections: 0 summary so far suite bucket_collections.collections_rebalance.CollectionsRebalance , pass 0 , fail 1 failures so far... 
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery
testrunner logs, diags and results are available under /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_1
Logs will be stored at /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_2
guides/gradlew --refresh-dependencies testrunner -P jython=/opt/jython/bin/jython -P 'args=-i /tmp/testexec.38389.ini GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -t bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_recovery_dgm'
Test Input params: {'data_load_stage': 'during', 'conf_file': 'conf/collections/collections_failover_dgm.conf', 'upgrade_version': '7.1.0-1277', 'dgm': '40', 'spec': 'collections_failover_dgm', 'rerun': 'False', 'num_nodes': 5, 'GROUP': 'P0_failover_and_recovery_dgm', 'enable_dp': 'True', 'bucket_spec': 'dgm.buckets_for_rebalance_tests', 'case_number': 2, 'cluster_name': 'testexec.38389', 'nodes_failover': '1', 'ini': '/tmp/testexec.38389.ini', 'get-cbcollect-info': 'True', 'recovery_type': 'delta', 'log_level': 'error', 'bucket_storage': 'magma', 'skip_validations': 'False', 'logs_folder': '/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_2', 'nodes_init': '3', 'infra_log_level': 'critical'}
test_data_load_collections_with_graceful_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ...
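test_1 above and test_2 below fail at the same point of setUp: lib/sdk_client3.py calls bucketObj.waitUntilReady(), and the Couchbase SDK raises UnambiguousTimeoutException once the 120 s budget is spent ("timeoutMs":120000 in the diagnostics JSON), even though the KV endpoints in that JSON report "state":"connected". A minimal Jython-style sketch of the call that times out, assuming the Couchbase Java SDK 3.x on the classpath; the address, credentials and bucket name below are placeholders, not the values used by this run:

# Illustrative sketch only, not lib/sdk_client3.py. Requires the Couchbase Java SDK
# 3.x jars on the Jython classpath and a reachable cluster.
from java.time import Duration
from com.couchbase.client.java import Cluster

cluster = Cluster.connect("couchbase://10.0.0.1", "Administrator", "password")  # placeholder node/credentials
bucket = cluster.bucket("example-bucket")  # placeholder bucket name
try:
    # Blocks until the bucket is usable; raises
    # com.couchbase.client.core.error.UnambiguousTimeoutException when the
    # readiness check does not finish within the timeout, which is the failure
    # reported for both test cases in this run.
    bucket.waitUntilReady(Duration.ofSeconds(120))
finally:
    cluster.disconnect()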
Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xbc type=client open_count=1 channel=[id: 0x0fa599bc, 0.0.0.0/0.0.0.0:54065] timeout=300.0>) 2021-09-14 02:25:44,986 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xbd type=client open_count=1 channel=[id: 0x7c9e23c5, 0.0.0.0/0.0.0.0:53486] timeout=300.0>) 2021-09-14 02:25:47,000 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xbe type=client open_count=1 channel=[id: 0x6fcc7a48, 0.0.0.0/0.0.0.0:49827] timeout=300.0>) 2021-09-14 02:25:49,013 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xbf type=client open_count=1 channel=[id: 0x3eeb39a6, 0.0.0.0/0.0.0.0:40730] timeout=300.0>) 2021-09-14 02:25:51,026 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xc0 type=client open_count=1 channel=[id: 0xa60cd549, 0.0.0.0/0.0.0.0:60304] timeout=300.0>) 2021-09-14 02:25:56,994 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xc1 type=client open_count=1 channel=[id: 0x3d92e52b, 0.0.0.0/0.0.0.0:55415] timeout=300.0>) 2021-09-14 02:25:59,009 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xc2 type=client open_count=1 channel=[id: 0x14365394, 0.0.0.0/0.0.0.0:41007] timeout=300.0>) 2021-09-14 02:26:01,023 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xc3 type=client open_count=1 channel=[id: 0x0f8f7f66, 0.0.0.0/0.0.0.0:48746] timeout=300.0>) 2021-09-14 02:26:03,035 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xc4 type=client open_count=1 channel=[id: 0x6fdeb1cd, 0.0.0.0/0.0.0.0:41100] timeout=300.0>) 2021-09-14 02:26:08,391 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error 
while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xc5 type=client open_count=1 channel=[id: 0xb9ed79b5, 0.0.0.0/0.0.0.0:38026] timeout=300.0>) 2021-09-14 02:26:10,404 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xc6 type=client open_count=1 channel=[id: 0xc28ec5c3, 0.0.0.0/0.0.0.0:48705] timeout=300.0>) 2021-09-14 02:26:12,415 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xc7 type=client open_count=1 channel=[id: 0x9b85dd86, 0.0.0.0/0.0.0.0:37594] timeout=300.0>) 2021-09-14 02:26:14,427 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xc8 type=client open_count=1 channel=[id: 0x0403eff3, 0.0.0.0/0.0.0.0:44054] timeout=300.0>) 2021-09-14 02:26:20,489 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xc9 type=client open_count=1 channel=[id: 0x314884fa, 0.0.0.0/0.0.0.0:37259] timeout=300.0>) 2021-09-14 02:26:22,503 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xca type=client open_count=1 channel=[id: 0x0f653a2d, 0.0.0.0/0.0.0.0:33872] timeout=300.0>) 2021-09-14 02:26:24,517 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xcb type=client open_count=1 channel=[id: 0x5539e4fd, 0.0.0.0/0.0.0.0:42667] timeout=300.0>) 2021-09-14 02:26:26,530 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xcc type=client open_count=1 channel=[id: 0xcab8843f, 0.0.0.0/0.0.0.0:57224] timeout=300.0>) 2021-09-14 02:26:28,542 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xcd type=client open_count=1 channel=[id: 0xa582776d, 0.0.0.0/0.0.0.0:48603] timeout=300.0>) 2021-09-14 02:26:36,691 | 
test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xce type=client open_count=1 channel=[id: 0x6fb2405d, 0.0.0.0/0.0.0.0:35050] timeout=300.0>) 2021-09-14 02:26:38,703 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xcf type=client open_count=1 channel=[id: 0x7fab5e29, 0.0.0.0/0.0.0.0:38346] timeout=300.0>) 2021-09-14 02:26:40,714 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd0 type=client open_count=1 channel=[id: 0xca9edf35, 0.0.0.0/0.0.0.0:37156] timeout=300.0>) 2021-09-14 02:26:42,726 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd1 type=client open_count=1 channel=[id: 0x20c37fb2, 0.0.0.0/0.0.0.0:49010] timeout=300.0>) 2021-09-14 02:26:44,739 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd2 type=client open_count=1 channel=[id: 0x0707c476, 0.0.0.0/0.0.0.0:52656] timeout=300.0>) 2021-09-14 02:26:46,753 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd3 type=client open_count=1 channel=[id: 0xa38769a8, 0.0.0.0/0.0.0.0:50170] timeout=300.0>) 2021-09-14 02:26:48,766 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd4 type=client open_count=1 channel=[id: 0x02c4af7b, 0.0.0.0/0.0.0.0:37234] timeout=300.0>) 2021-09-14 02:26:50,780 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd5 type=client open_count=1 channel=[id: 0x6043bf41, 0.0.0.0/0.0.0.0:47371] timeout=300.0>) 2021-09-14 02:26:52,792 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xd6 type=client open_count=1 channel=[id: 
0xa2b62d96, 0.0.0.0/0.0.0.0:57067] timeout=300.0>) 2021-09-14 02:26:54,805 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Traceback (most recent call last): File "pytests/bucket_collections/collections_base.py", line 63, in setUp self.collection_setup() File "pytests/bucket_collections/collections_base.py", line 158, in collection_setup self.create_sdk_clients(self.task_manager.number_of_threads, File "pytests/bucket_collections/collections_base.py", line 111, in create_sdk_clients sdk_client_pool.create_clients( File "lib/sdk_client3.py", line 111, in create_clients self.clients[bucket.name]["idle_clients"].append(SDKClient( File "lib/sdk_client3.py", line 241, in __init__ self.__create_conn() File "lib/sdk_client3.py", line 303, in _SDKClient__create_conn self.bucketObj.waitUntilReady( UnambiguousTimeoutException: com.couchbase.client.core.error.UnambiguousTimeoutException: WaitUntilReady timed out {"bucket":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","checkedServices":["KV"],"currentState":"ONLINE","desiredState":"ONLINE","services":{"mgmt":[{"last_activity_us":20133,"state":"connected","id":"0x1a3b45a1","remote":"172.23.123.119:8091","local":"172.23.123.71:35778"},{"last_activity_us":9974,"state":"connected","id":"0x84aef206","remote":"172.23.121.221:8091","local":"172.23.123.71:52842"},{"last_activity_us":318,"state":"connected","id":"0x026bcac1","remote":"172.23.121.222:8091","local":"172.23.123.71:55154"}],"kv":[{"last_activity_us":2005106,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x86422ab5","remote":"172.23.123.119:11210","local":"172.23.123.71:47718"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x034e5285","remote":"172.23.123.119:11210","local":"172.23.123.71:47720"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd99e707f","remote":"172.23.123.119:11210","local":"172.23.123.71:47722"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x25bead26","remote":"172.23.123.119:11210","local":"172.23.123.71:47724"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x200d1a08","remote":"172.23.123.119:11210","local":"172.23.123.71:47726"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x010cbbaa","remote":"172.23.123.119:11210","local":"172.23.123.71:47728"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbac61f2d","remote":"172.23.123.119:11210","local":"172.23.123.71:47730"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbb245f01","remote":"172.23.123.119:11210","local":"172.23.123.71:47732"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x26c004a5","remote":"172.23.123.119:11210","local":"172.23.123.71:47734"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x334031cd","remote":"172.23.123.119:11210","local":"172.23.123.71:47736"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7eb6e453","remote":"172.23.123.119:11210","local":"172.23.123.71:47738
"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7ebc5cc5","remote":"172.23.123.119:11210","local":"172.23.123.71:47740"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd3f1632b","remote":"172.23.123.119:11210","local":"172.23.123.71:47742"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xfc56fd8e","remote":"172.23.123.119:11210","local":"172.23.123.71:47744"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x61f9d6a4","remote":"172.23.123.119:11210","local":"172.23.123.71:47746"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbdcd0eeb","remote":"172.23.123.119:11210","local":"172.23.123.71:47748"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc6ec3777","remote":"172.23.123.119:11210","local":"172.23.123.71:47750"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6f4d5105","remote":"172.23.123.119:11210","local":"172.23.123.71:47752"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa3c645eb","remote":"172.23.123.119:11210","local":"172.23.123.71:47754"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x0fc4be55","remote":"172.23.123.119:11210","local":"172.23.123.71:47756"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x922d05fb","remote":"172.23.123.119:11210","local":"172.23.123.71:47760"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xb152477c","remote":"172.23.123.119:11210","local":"172.23.123.71:47758"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x602af04b","remote":"172.23.123.119:11210","local":"172.23.123.71:47762"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdd94678f","remote":"172.23.123.119:11210","local":"172.23.123.71:47766"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc77ed8ac","remote":"172.23.123.119:11210","local":"172.23.123.71:47764"},{"last_activity_us":5948,"state":"connected","id":"0xdd16225d","remote":"172.23.123.119:11210","local":"172.23.123.71:47716"},{"last_activity_us":2006067,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x5b81bd71","remote":"172.23.121.221:11210","local":"172.23.123.71:56132"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdd09dfb6","remote":"172.23.121.221:11210","local":"172.23.123.71:56134"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x2792b130","remote":"172.23.121.221:11210","local":"172.23.123.71:56136"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xe5665213","remote":"172.23.121.221:11210","local":"172.23.123.71:56150"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x561a7fb6","remote":"172.23.121.221:11210","local":"172.23.123.71:56138"},{"namespace":"xPCSEpyWGbSry-l%bEoWsb
xagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa60f3448","remote":"172.23.121.221:11210","local":"172.23.123.71:56140"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x01ec24f4","remote":"172.23.121.221:11210","local":"172.23.123.71:56142"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc8b0ddc3","remote":"172.23.121.221:11210","local":"172.23.123.71:56144"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x01a4d88a","remote":"172.23.121.221:11210","local":"172.23.123.71:56146"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4ddd29fd","remote":"172.23.121.221:11210","local":"172.23.123.71:56152"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x30be67e5","remote":"172.23.121.221:11210","local":"172.23.123.71:56194"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x2e766044","remote":"172.23.121.221:11210","local":"172.23.123.71:56148"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x700978bb","remote":"172.23.121.221:11210","local":"172.23.123.71:56166"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1843bc45","remote":"172.23.121.221:11210","local":"172.23.123.71:56180"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6ff4f4fa","remote":"172.23.121.221:11210","local":"172.23.123.71:56222"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6533049a","remote":"172.23.121.221:11210","local":"172.23.123.71:56154"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x24f2aeaf","remote":"172.23.121.221:11210","local":"172.23.123.71:56196"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x09b7c546","remote":"172.23.121.221:11210","local":"172.23.123.71:56210"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x50322683","remote":"172.23.121.221:11210","local":"172.23.123.71:56168"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd442d49d","remote":"172.23.121.221:11210","local":"172.23.123.71:56182"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x122b681e","remote":"172.23.121.221:11210","local":"172.23.123.71:56224"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbd915536","remote":"172.23.121.221:11210","local":"172.23.123.71:56156"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1b761d5b","remote":"172.23.121.221:11210","local":"172.23.123.71:56198"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xcc821fbc","remote":"172.23.121.221:11210","local":"172.23.123.71:56212"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdbd9d330","remote":"172.23.121.221:11210","local":"172.23.123.71:56170"},{"last_activity_us":5997,"state":"connected","id":"
0x311b62e2","remote":"172.23.121.221:11210","local":"172.23.123.71:56128"},{"last_activity_us":2006257,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6e2ff5d3","remote":"172.23.121.222:11210","local":"172.23.123.71:52064"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4f50ce3b","remote":"172.23.121.222:11210","local":"172.23.123.71:52106"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xccf3aef9","remote":"172.23.121.222:11210","local":"172.23.123.71:52038"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x57c4ccc0","remote":"172.23.121.222:11210","local":"172.23.123.71:52080"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x552e236b","remote":"172.23.121.222:11210","local":"172.23.123.71:52094"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x954bdd3f","remote":"172.23.121.222:11210","local":"172.23.123.71:52052"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd1211f02","remote":"172.23.121.222:11210","local":"172.23.123.71:52066"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x045f8303","remote":"172.23.121.222:11210","local":"172.23.123.71:52108"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x95c5e2f2","remote":"172.23.121.222:11210","local":"172.23.123.71:52040"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa35d6186","remote":"172.23.121.222:11210","local":"172.23.123.71:52082"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x823afd76","remote":"172.23.121.222:11210","local":"172.23.123.71:52096"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7a0b1b5b","remote":"172.23.121.222:11210","local":"172.23.123.71:52054"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1bbbd86f","remote":"172.23.121.222:11210","local":"172.23.123.71:52068"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4f4aab28","remote":"172.23.121.222:11210","local":"172.23.123.71:52110"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x581605a9","remote":"172.23.121.222:11210","local":"172.23.123.71:52042"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x90711dd8","remote":"172.23.121.222:11210","local":"172.23.123.71:52084"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x242b3792","remote":"172.23.121.222:11210","local":"172.23.123.71:52098"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x07353a78","remote":"172.23.121.222:11210","local":"172.23.123.71:52056"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1c97ad17","remote":"172.23.121.222:11210","local":"172.23.123.71:52070"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"conne
cted","id":"0x0e7e0832","remote":"172.23.121.222:11210","local":"172.23.123.71:52112"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x0527f2ec","remote":"172.23.121.222:11210","local":"172.23.123.71:52044"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xed49eb44","remote":"172.23.121.222:11210","local":"172.23.123.71:52086"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x06876a10","remote":"172.23.121.222:11210","local":"172.23.123.71:52100"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xb61dc1d6","remote":"172.23.121.222:11210","local":"172.23.123.71:52058"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xe976b6a6","remote":"172.23.121.222:11210","local":"172.23.123.71:52072"},{"last_activity_us":6245,"state":"connected","id":"0x6eaf3499","remote":"172.23.121.222:11210","local":"172.23.123.71:52010"}]},"state":{"current_stage":"BUCKET_NODES_HEALTHY","current_stage_since_ms":4,"timings_ms":{"BUCKET_CONFIG_READY":0,"CONFIG_LOAD":130,"BUCKET_NODES_HEALTHY":9},"total_ms":109438},"timeoutMs":120000} ERROR ====================================================================== ERROR: test_data_load_collections_with_graceful_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ---------------------------------------------------------------------- Traceback (most recent call last): File "pytests/bucket_collections/collections_rebalance.py", line 22, in setUp super(CollectionsRebalance, self).setUp() File "pytests/bucket_collections/collections_base.py", line 65, in setUp self.handle_setup_exception(exception) File "pytests/basetestcase.py", line 609, in handle_setup_exception raise exception_obj UnambiguousTimeoutException: com.couchbase.client.core.error.UnambiguousTimeoutException: WaitUntilReady timed out 
{"bucket":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","checkedServices":["KV"],"currentState":"ONLINE","desiredState":"ONLINE","services":{"mgmt":[{"last_activity_us":20133,"state":"connected","id":"0x1a3b45a1","remote":"172.23.123.119:8091","local":"172.23.123.71:35778"},{"last_activity_us":9974,"state":"connected","id":"0x84aef206","remote":"172.23.121.221:8091","local":"172.23.123.71:52842"},{"last_activity_us":318,"state":"connected","id":"0x026bcac1","remote":"172.23.121.222:8091","local":"172.23.123.71:55154"}],"kv":[{"last_activity_us":2005106,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x86422ab5","remote":"172.23.123.119:11210","local":"172.23.123.71:47718"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x034e5285","remote":"172.23.123.119:11210","local":"172.23.123.71:47720"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd99e707f","remote":"172.23.123.119:11210","local":"172.23.123.71:47722"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x25bead26","remote":"172.23.123.119:11210","local":"172.23.123.71:47724"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x200d1a08","remote":"172.23.123.119:11210","local":"172.23.123.71:47726"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x010cbbaa","remote":"172.23.123.119:11210","local":"172.23.123.71:47728"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbac61f2d","remote":"172.23.123.119:11210","local":"172.23.123.71:47730"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbb245f01","remote":"172.23.123.119:11210","local":"172.23.123.71:47732"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x26c004a5","remote":"172.23.123.119:11210","local":"172.23.123.71:47734"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x334031cd","remote":"172.23.123.119:11210","local":"172.23.123.71:47736"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7eb6e453","remote":"172.23.123.119:11210","local":"172.23.123.71:47738"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7ebc5cc5","remote":"172.23.123.119:11210","local":"172.23.123.71:47740"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd3f1632b","remote":"172.23.123.119:11210","local":"172.23.123.71:47742"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xfc56fd8e","remote":"172.23.123.119:11210","local":"172.23.123.71:47744"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x61f9d6a4","remote":"172.23.123.119:11210","local":"172.23.123.71:47746"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbdcd0eeb","remote":"172.23.123.119:11210","local":"172.23.123.71:47748"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc6ec3777","remote":"172.23.123.119:11210","l
ocal":"172.23.123.71:47750"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6f4d5105","remote":"172.23.123.119:11210","local":"172.23.123.71:47752"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa3c645eb","remote":"172.23.123.119:11210","local":"172.23.123.71:47754"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x0fc4be55","remote":"172.23.123.119:11210","local":"172.23.123.71:47756"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x922d05fb","remote":"172.23.123.119:11210","local":"172.23.123.71:47760"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xb152477c","remote":"172.23.123.119:11210","local":"172.23.123.71:47758"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x602af04b","remote":"172.23.123.119:11210","local":"172.23.123.71:47762"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdd94678f","remote":"172.23.123.119:11210","local":"172.23.123.71:47766"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc77ed8ac","remote":"172.23.123.119:11210","local":"172.23.123.71:47764"},{"last_activity_us":5948,"state":"connected","id":"0xdd16225d","remote":"172.23.123.119:11210","local":"172.23.123.71:47716"},{"last_activity_us":2006067,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x5b81bd71","remote":"172.23.121.221:11210","local":"172.23.123.71:56132"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdd09dfb6","remote":"172.23.121.221:11210","local":"172.23.123.71:56134"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x2792b130","remote":"172.23.121.221:11210","local":"172.23.123.71:56136"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xe5665213","remote":"172.23.121.221:11210","local":"172.23.123.71:56150"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x561a7fb6","remote":"172.23.121.221:11210","local":"172.23.123.71:56138"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa60f3448","remote":"172.23.121.221:11210","local":"172.23.123.71:56140"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x01ec24f4","remote":"172.23.121.221:11210","local":"172.23.123.71:56142"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xc8b0ddc3","remote":"172.23.121.221:11210","local":"172.23.123.71:56144"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x01a4d88a","remote":"172.23.121.221:11210","local":"172.23.123.71:56146"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4ddd29fd","remote":"172.23.121.221:11210","local":"172.23.123.71:56152"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x30be67e5","remote":"172.23.121.221:11210","local":"172.23.123.71:56194"},{"namespac
e":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x2e766044","remote":"172.23.121.221:11210","local":"172.23.123.71:56148"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x700978bb","remote":"172.23.121.221:11210","local":"172.23.123.71:56166"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1843bc45","remote":"172.23.121.221:11210","local":"172.23.123.71:56180"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6ff4f4fa","remote":"172.23.121.221:11210","local":"172.23.123.71:56222"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6533049a","remote":"172.23.121.221:11210","local":"172.23.123.71:56154"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x24f2aeaf","remote":"172.23.121.221:11210","local":"172.23.123.71:56196"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x09b7c546","remote":"172.23.121.221:11210","local":"172.23.123.71:56210"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x50322683","remote":"172.23.121.221:11210","local":"172.23.123.71:56168"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd442d49d","remote":"172.23.121.221:11210","local":"172.23.123.71:56182"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x122b681e","remote":"172.23.121.221:11210","local":"172.23.123.71:56224"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xbd915536","remote":"172.23.121.221:11210","local":"172.23.123.71:56156"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1b761d5b","remote":"172.23.121.221:11210","local":"172.23.123.71:56198"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xcc821fbc","remote":"172.23.121.221:11210","local":"172.23.123.71:56212"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xdbd9d330","remote":"172.23.121.221:11210","local":"172.23.123.71:56170"},{"last_activity_us":5997,"state":"connected","id":"0x311b62e2","remote":"172.23.121.221:11210","local":"172.23.123.71:56128"},{"last_activity_us":2006257,"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x6e2ff5d3","remote":"172.23.121.222:11210","local":"172.23.123.71:52064"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4f50ce3b","remote":"172.23.121.222:11210","local":"172.23.123.71:52106"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xccf3aef9","remote":"172.23.121.222:11210","local":"172.23.123.71:52038"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x57c4ccc0","remote":"172.23.121.222:11210","local":"172.23.123.71:52080"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x552e236b","remote":"172.23.121.222:11210","local":"172.23.123.71:52094"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEn
suoW--fPXS_kVLK-48-366000","state":"connected","id":"0x954bdd3f","remote":"172.23.121.222:11210","local":"172.23.123.71:52052"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xd1211f02","remote":"172.23.121.222:11210","local":"172.23.123.71:52066"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x045f8303","remote":"172.23.121.222:11210","local":"172.23.123.71:52108"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x95c5e2f2","remote":"172.23.121.222:11210","local":"172.23.123.71:52040"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xa35d6186","remote":"172.23.121.222:11210","local":"172.23.123.71:52082"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x823afd76","remote":"172.23.121.222:11210","local":"172.23.123.71:52096"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x7a0b1b5b","remote":"172.23.121.222:11210","local":"172.23.123.71:52054"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1bbbd86f","remote":"172.23.121.222:11210","local":"172.23.123.71:52068"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x4f4aab28","remote":"172.23.121.222:11210","local":"172.23.123.71:52110"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x581605a9","remote":"172.23.121.222:11210","local":"172.23.123.71:52042"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x90711dd8","remote":"172.23.121.222:11210","local":"172.23.123.71:52084"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x242b3792","remote":"172.23.121.222:11210","local":"172.23.123.71:52098"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x07353a78","remote":"172.23.121.222:11210","local":"172.23.123.71:52056"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x1c97ad17","remote":"172.23.121.222:11210","local":"172.23.123.71:52070"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x0e7e0832","remote":"172.23.121.222:11210","local":"172.23.123.71:52112"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x0527f2ec","remote":"172.23.121.222:11210","local":"172.23.123.71:52044"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xed49eb44","remote":"172.23.121.222:11210","local":"172.23.123.71:52086"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0x06876a10","remote":"172.23.121.222:11210","local":"172.23.123.71:52100"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xb61dc1d6","remote":"172.23.121.222:11210","local":"172.23.123.71:52058"},{"namespace":"xPCSEpyWGbSry-l%bEoWsbxagO0v0T0qOEnsuoW--fPXS_kVLK-48-366000","state":"connected","id":"0xe976b6a6","remote":"172.23.121.222:11210","local":"172.23.123.71:52072"},{"last_activity_us":6245,"state":"connected","id":"0x6eaf3499","
remote":"172.23.121.222:11210","local":"172.23.123.71:52010"}]},"state":{"current_stage":"BUCKET_NODES_HEALTHY","current_stage_since_ms":6,"timings_ms":{"BUCKET_CONFIG_READY":0,"CONFIG_LOAD":130,"BUCKET_NODES_HEALTHY":9},"total_ms":109440},"timeoutMs":120000} ---------------------------------------------------------------------- Ran 1 test in 289.636s FAILED (errors=1) During the test, Remote Connections: 36, Disconnections: 36 SDK Connections: 0, Disconnections: 0 summary so far suite bucket_collections.collections_rebalance.CollectionsRebalance , pass 0 , fail 2 failures so far... bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery testrunner logs, diags and results are available under /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_2 Logs will be stored at /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_3 guides/gradlew --refresh-dependencies testrunner -P jython=/opt/jython/bin/jython -P 'args=-i /tmp/testexec.38389.ini GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -t bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_recovery_dgm' Test Input params: {'data_load_stage': 'during', 'conf_file': 'conf/collections/collections_failover_dgm.conf', 'upgrade_version': '7.1.0-1277', 'dgm': '40', 'spec': 'collections_failover_dgm', 'rerun': 'False', 'num_nodes': 5, 'GROUP': 'P0_failover_and_recovery_dgm', 'enable_dp': 'True', 'bucket_spec': 'dgm.buckets_for_rebalance_tests', 'case_number': 3, 'cluster_name': 'testexec.38389', 'nodes_failover': '1', 'ini': '/tmp/testexec.38389.ini', 'get-cbcollect-info': 'True', 'recovery_type': 'full', 'log_level': 'error', 'bucket_storage': 'magma', 'skip_validations': 'False', 'logs_folder': '/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_3', 'nodes_init': '3', 'infra_log_level': 'critical'} test_data_load_collections_with_hard_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ... 
Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xdf type=client open_count=1 channel=[id: 0xc064032a, 0.0.0.0/0.0.0.0:37731] timeout=300.0>) 2021-09-14 02:30:32,714 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xe0 type=client open_count=1 channel=[id: 0x6b8414a9, 0.0.0.0/0.0.0.0:55521] timeout=300.0>) 2021-09-14 02:30:34,726 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xe1 type=client open_count=1 channel=[id: 0xbf090a6a, 0.0.0.0/0.0.0.0:41726] timeout=300.0>) 2021-09-14 02:30:36,740 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xe2 type=client open_count=1 channel=[id: 0x67d273b6, 0.0.0.0/0.0.0.0:58263] timeout=300.0>) 2021-09-14 02:30:38,753 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0xe3 type=client open_count=1 channel=[id: 0xb0c363a9, 0.0.0.0/0.0.0.0:47215] timeout=300.0>) 2021-09-14 02:30:40,763 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xe4 type=client open_count=1 channel=[id: 0x3c395982, 0.0.0.0/0.0.0.0:44586] timeout=300.0>) 2021-09-14 02:30:45,980 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xe5 type=client open_count=1 channel=[id: 0xc3b3bad8, 0.0.0.0/0.0.0.0:60829] timeout=300.0>) 2021-09-14 02:30:47,996 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xe6 type=client open_count=1 channel=[id: 0x4d6c35a1, 0.0.0.0/0.0.0.0:60027] timeout=300.0>) 2021-09-14 02:30:50,007 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0xe7 type=client open_count=1 channel=[id: 0x1f07c7b7, 0.0.0.0/0.0.0.0:46162] timeout=300.0>) 2021-09-14 02:30:52,019 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error 
while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xe8 type=client open_count=1 channel=[id: 0x2edb92a6, 0.0.0.0/0.0.0.0:45725] timeout=300.0>) 2021-09-14 02:30:59,523 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xe9 type=client open_count=1 channel=[id: 0xfd2e9b82, 0.0.0.0/0.0.0.0:51051] timeout=300.0>) 2021-09-14 02:31:01,536 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xea type=client open_count=1 channel=[id: 0xcace26b2, 0.0.0.0/0.0.0.0:57794] timeout=300.0>) 2021-09-14 02:31:03,548 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xeb type=client open_count=1 channel=[id: 0xd5cf7c82, 0.0.0.0/0.0.0.0:59477] timeout=300.0>) 2021-09-14 02:31:05,559 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0xec type=client open_count=1 channel=[id: 0xcb4d3b30, 0.0.0.0/0.0.0.0:38453] timeout=300.0>) 2021-09-14 02:31:07,571 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xed type=client open_count=1 channel=[id: 0x86c4a3d7, 0.0.0.0/0.0.0.0:49131] timeout=300.0>) 2021-09-14 02:31:13,121 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xee type=client open_count=1 channel=[id: 0x6579f700, 0.0.0.0/0.0.0.0:36471] timeout=300.0>) 2021-09-14 02:31:15,132 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xef type=client open_count=1 channel=[id: 0x775e0702, 0.0.0.0/0.0.0.0:44207] timeout=300.0>) 2021-09-14 02:31:17,147 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xf0 type=client open_count=1 channel=[id: 0x747bfb3a, 0.0.0.0/0.0.0.0:55020] timeout=300.0>) 2021-09-14 
02:31:19,157 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0xf1 type=client open_count=1 channel=[id: 0xc3389264, 0.0.0.0/0.0.0.0:47207] timeout=300.0>) 2021-09-14 02:31:21,170 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf2 type=client open_count=1 channel=[id: 0xa132f7de, 0.0.0.0/0.0.0.0:60579] timeout=300.0>) 2021-09-14 02:31:28,605 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf3 type=client open_count=1 channel=[id: 0x113ffac4, 0.0.0.0/0.0.0.0:46033] timeout=300.0>) 2021-09-14 02:31:30,617 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf4 type=client open_count=1 channel=[id: 0x8255446e, 0.0.0.0/0.0.0.0:52011] timeout=300.0>) 2021-09-14 02:31:32,628 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf5 type=client open_count=1 channel=[id: 0x99ec23cb, 0.0.0.0/0.0.0.0:52907] timeout=300.0>) 2021-09-14 02:31:34,641 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf6 type=client open_count=1 channel=[id: 0xa84e24f5, 0.0.0.0/0.0.0.0:56287] timeout=300.0>) 2021-09-14 02:31:36,653 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf7 type=client open_count=1 channel=[id: 0xfba3380e, 0.0.0.0/0.0.0.0:48630] timeout=300.0>) 2021-09-14 02:31:38,664 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf8 type=client open_count=1 channel=[id: 0x146020c2, 0.0.0.0/0.0.0.0:40202] timeout=300.0>) 2021-09-14 02:31:40,677 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xf9 type=client open_count=1 
channel=[id: 0x20694d1b, 0.0.0.0/0.0.0.0:41181] timeout=300.0>) 2021-09-14 02:31:42,690 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0xfa type=client open_count=1 channel=[id: 0x349d368f, 0.0.0.0/0.0.0.0:52511] timeout=300.0>) 2021-09-14 02:31:44,703 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused ############################## 172.23.123.119: 1 core dump seen running: //opt/couchbase/bin/minidump-2-core /opt/couchbase/var/lib/couchbase/crash/f344d4be-47fa-4521-4c835ca8-226ee566.dmp > /opt/couchbase/var/lib/couchbase/crash/f344d4be-47fa-4521-4c835ca8-226ee566.core running: gdb --batch /opt/couchbase/bin/memcached -c /opt/couchbase/var/lib/couchbase/crash/f344d4be-47fa-4521-4c835ca8-226ee566.core -ex "bt full" -ex quit 172.23.123.119: Stack Trace of first crash - f344d4be-47fa-4521-4c835ca8-226ee566.dmp Core was generated by `/opt/couchbase/bin/memcached -C /opt/couchbase/var/lib/couchbase/config/memcach'. #0 0x00007f7d3c685337 in raise () from /lib64/libc.so.6 #0 0x00007f7d3c685337 in raise () from /lib64/libc.so.6 No symbol table info available. #1 0x00007f7d3c686a28 in abort () from /lib64/libc.so.6 No symbol table info available. #2 0x00007f7d3cfd063c in __gnu_cxx::__verbose_terminate_handler () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/vterminate.cc:95 terminating = false t = #3 0x0000000000a99a3b in backtrace_terminate_handler() () No symbol table info available. #4 0x00007f7d3cfdb8f6 in __cxxabiv1::__terminate(void (*)()) () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:48 No locals. #5 0x00007f7d3cfdb961 in std::terminate () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:58 No locals. #6 0x00007f7d3cfdbc46 in __cxxabiv1::__cxa_rethrow () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_throw.cc:133 globals = header = #7 0x00000000004c3efe in EPBucket::compactionCompletionCallback(CompactionContext&) [clone .cold] () No symbol table info available. #8 0x00000000008524f2 in MagmaKVStore::compactDBInternal(std::unique_lock&, std::shared_ptr) () No symbol table info available. #9 0x0000000000852f76 in MagmaKVStore::compactDB(std::unique_lock&, std::shared_ptr) () No symbol table info available. #10 0x00000000007eb3ba in EPBucket::compactInternal(LockedVBucketPtr&, CompactionConfig&) () No symbol table info available. #11 0x00000000007ec981 in EPBucket::doCompact(Vbid, CompactionConfig&, std::vector >&) () No symbol table info available. #12 0x0000000000706666 in CompactTask::run() () No symbol table info available. #13 0x0000000000a0ae52 in GlobalTask::execute() () No symbol table info available. #14 0x0000000000a07f75 in FollyExecutorPool::TaskProxy::scheduleViaCPUPool()::{lambda()#2}::operator()() const () No symbol table info available. #15 0x0000000000b59b30 in folly::ThreadPoolExecutor::runTask(std::shared_ptr const&, folly::ThreadPoolExecutor::Task&&) () No symbol table info available. #16 0x0000000000b418ea in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () No symbol table info available. 
#17 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () No symbol table info available. #18 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () No symbol table info available. #19 0x00007f7d3d004d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 No locals. #20 0x00007f7d3ee24e65 in start_thread () from /lib64/libpthread.so.0 No symbol table info available. #21 0x00007f7d3c74d88d in clone () from /lib64/libc.so.6 No symbol table info available. ############################## running: gdb -p `(pidof memcached)` -ex "thread apply all bt" -ex detach -ex quit [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib64/libthread_db.so.1". Loaded symbols for /lib64/libpthread.so.0 Reading symbols from /lib64/librt.so.1...(no debugging symbols found)...done. Loaded symbols for /lib64/librt.so.1 Reading symbols from /opt/couchbase/bin/../lib/liblz4.so.1...Reading symbols from /opt/couchbase/bin/../lib/liblz4.so.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/liblz4.so.1 Reading symbols from /lib64/libsnappy.so.1...Reading symbols from /lib64/libsnappy.so.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /lib64/libsnappy.so.1 Reading symbols from /opt/couchbase/bin/../lib/libjemalloc.so.2...Reading symbols from /opt/couchbase/bin/../lib/libjemalloc.so.2...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libjemalloc.so.2 Reading symbols from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libssl.so.1.1...Reading symbols from /opt/couchbase/bin/../lib/libssl.so.1.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libssl.so.1.1 Reading symbols from /opt/couchbase/bin/../lib/libcrypto.so.1.1...Reading symbols from /opt/couchbase/bin/../lib/libcrypto.so.1.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libcrypto.so.1.1 Reading symbols from /opt/couchbase/bin/../lib/libstdc++.so.6...done. 
Loaded symbols for /opt/couchbase/bin/../lib/libstdc++.so.6 Reading symbols from /lib64/libm.so.6...(no debugging symbols found)...done. Loaded symbols for /lib64/libm.so.6 Reading symbols from /opt/couchbase/bin/../lib/libgcc_s.so.1...done. Loaded symbols for /opt/couchbase/bin/../lib/libgcc_s.so.1 Reading symbols from /lib64/libc.so.6...(no debugging symbols found)...done. Loaded symbols for /lib64/libc.so.6 Reading symbols from /lib64/ld-linux-x86-64.so.2...(no debugging symbols found)...done. Loaded symbols for /lib64/ld-linux-x86-64.so.2 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 To enable execution of this file add add-auto-load-safe-path /opt/couchbase/lib/libstdc++.so.6.0.28-gdb.py line to your configuration file "/root/.gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/root/.gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" Thread 47 (Thread 0x7f0398fff700 (LWP 35643)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039b68a5a7 in background_thread_entry () from /opt/couchbase/bin/../lib/libjemalloc.so.2 #2 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 46 (Thread 0x7f03981fe700 (LWP 35644)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000ac40aa in spdlog::details::thread_pool::process_next_msg_() () #2 0x0000000000ac43f8 in _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJZN6spdlog7details11thread_poolC4EmmSt8functionIFvvEEEUlvE_EEEEE6_M_runEv () #3 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 45 (Thread 0x7f03979fd700 (LWP 35645)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000ac3e78 in _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJZN6spdlog7details15periodic_workerC4ERKSt8functionIFvvEENSt6chrono8durationIlSt5ratioILl1ELl1EEEEEUlvE_EEEEE6_M_runEv () #2 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #3 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 44 (Thread 0x7f03971fc700 (LWP 35646)): #0 0x00007f0399a9ebed in poll () from /lib64/libc.so.6 #1 0x000000000059e201 in check_stdin_thread(void*) () #2 0x0000000000af6fe9 in platform_thread_wrap(void*) () #3 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 43 (Thread 0x7f03951f8700 (LWP 35650)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00000000005e4906 in ExternalAuthManagerThread::run() () #2 0x0000000000b044f3 in Couchbase::Thread::thread_entry() () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 42 (Thread 0x7f038ffff700 (LWP 35651)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from 
/lib64/libpthread.so.0 #1 0x000000000066c827 in AuditImpl::consume_events() () #2 0x0000000000af6fe9 in platform_thread_wrap(void*) () #3 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 41 (Thread 0x7f03959f9700 (LWP 35652)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000a3899b in worker_thread () #2 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 40 (Thread 0x7f03961fa700 (LWP 35653)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000a3899b in worker_thread () #2 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 39 (Thread 0x7f03969fb700 (LWP 35654)): #0 0x00007f0399a9ebed in poll () from /lib64/libc.so.6 #1 0x0000000000a2f908 in master_thread () #2 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 38 (Thread 0x7f038f7fe700 (LWP 35655)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 37 (Thread 0x7f038effd700 (LWP 35656)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 36 (Thread 0x7f038e7fc700 (LWP 35657)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 35 (Thread 0x7f038dffb700 (LWP 35658)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from 
/opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 34 (Thread 0x7f0387fff700 (LWP 35659)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 33 (Thread 0x7f03877fe700 (LWP 35660)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 32 (Thread 0x7f0386ffd700 (LWP 35661)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 
0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 31 (Thread 0x7f0384fd7700 (LWP 35662)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 30 (Thread 0x7f0377fff700 (LWP 35663)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool 
folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 29 (Thread 0x7f03777fe700 (LWP 35664)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 28 (Thread 0x7f0376ffd700 (LWP 35665)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in 
execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 27 (Thread 0x7f03767fc700 (LWP 35666)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 26 (Thread 0x7f0375ffb700 (LWP 35667)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 25 (Thread 0x7f03757fa700 (LWP 35668)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, 
__lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x000000000092e08c in magma::WaitGroup::Wait() () #4 0x00000000009b1bfd in magma::SSTableLoader::Load() () #5 0x00000000009aac2c in magma::TreeSnapshot::decode(std::__cxx11::basic_string, std::allocator > const&, magma::TableLoader&, std::function > ()>) () #6 0x00000000009ab846 in magma::TreeSnapshot::Read(unsigned long, magma::TableLoader&, std::function > ()>) () #7 0x00000000009917f0 in magma::LSMTree::loadCheckpoints(std::vector > const&, std::vector >&) () #8 0x0000000000982ce9 in magma::LSMTree::Open(std::vector > const&) () #9 0x0000000000923b96 in magma::KVStore::Open() () #10 0x00000000008c87ef in magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&)::{lambda()#4}::operator()() const () #11 0x00000000008c8ced in std::_Function_handler > (), magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&)::{lambda()#4}>::_M_invoke(std::_Any_data const&) () #12 0x00000000008fb1e3 in magma::KVStoreSet::CreateKVStore(std::function > ()>) () #13 0x00000000008ce541 in magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&) () #14 0x0000000000926ef3 in magma::Magma::Impl::recovery() () #15 0x00000000008c769f in magma::Magma::Impl::Open() () #16 0x00000000008c784d in magma::Magma::Open() () #17 0x0000000000858cd1 in MagmaMemoryTrackingProxy::Open() () #18 0x000000000084e691 in MagmaKVStore::MagmaKVStore(MagmaKVStoreConfig&) () #19 0x00000000007ffc4b in KVStoreFactory::create(KVStoreConfig&) () #20 0x00000000006f13cb in KVShard::KVShard(EventuallyPersistentEngine&, unsigned short) () #21 0x0000000000720b3f in VBucketMap::VBucketMap(Configuration&, KVBucket&) () #22 0x00000000006e44fb in KVBucket::KVBucket(EventuallyPersistentEngine&) () #23 0x00000000007e92f6 in EPBucket::EPBucket(EventuallyPersistentEngine&) () #24 0x0000000000697188 in EventuallyPersistentEngine::makeBucket(Configuration&) () #25 0x000000000069c735 in EventuallyPersistentEngine::initialize(char const*) () #26 0x00000000005b4635 in BucketManager::create(Cookie&, std::__cxx11::basic_string, std::allocator >, std::__cxx11::basic_string, std::allocator >, BucketType) () #27 0x000000000061e60e in std::_Function_handler::_M_invoke(std::_Any_data const&) () #28 0x000000000061cbc4 in OneShotTask::run() () #29 0x0000000000a0ae52 in GlobalTask::execute() () #30 0x0000000000a07f75 in FollyExecutorPool::TaskProxy::scheduleViaCPUPool()::{lambda()#2}::operator()() const () #31 0x0000000000b59b30 in folly::ThreadPoolExecutor::runTask(std::shared_ptr const&, folly::ThreadPoolExecutor::Task&&) () #32 0x0000000000b418ea in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #33 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #34 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #35 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #36 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #37 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 24 (Thread 0x7f0374ff9700 (LWP 35669)): #0 0x00007f0399aa3ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned 
int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 23 (Thread 0x7f0353fff700 (LWP 35670)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x0000000000b51359 in folly::IOThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x00007f039a360d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #9 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #10 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 22 (Thread 0x7f03537fe700 (LWP 35672)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039b689e51 in background_thread_entry () from /opt/couchbase/bin/../lib/libjemalloc.so.2 #2 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 21 (Thread 0x7f0352ffd700 (LWP 35673)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 20 (Thread 0x7f03527fc700 (LWP 35674)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 19 (Thread 0x7f0351ffb700 (LWP 35675)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from 
/lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 18 (Thread 0x7f03517fa700 (LWP 35676)): #0 0x00007f039c184da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 17 (Thread 0x7f0350ff9700 (LWP 35677)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 16 (Thread 0x7f033bfff700 (LWP 35678)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 15 (Thread 0x7f033b7fe700 (LWP 35679)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 14 (Thread 0x7f033affd700 (LWP 35680)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 13 (Thread 0x7f0338e56700 (LWP 35682)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 12 (Thread 0x7f0333fff700 (LWP 35683)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 11 (Thread 0x7f03337fe700 (LWP 35684)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 10 (Thread 0x7f0332ffd700 (LWP 35685)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 9 (Thread 0x7f03327fc700 (LWP 35686)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 8 (Thread 0x7f0331ffb700 (LWP 35687)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 7 (Thread 0x7f03317fa700 (LWP 35688)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 6 (Thread 0x7f0330ff9700 (LWP 35689)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 5 (Thread 0x7f03307f8700 (LWP 35690)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 4 (Thread 0x7f032fff7700 (LWP 35691)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 3 (Thread 0x7f032f7f6700 (LWP 35692)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 2 (Thread 0x7f032eff5700 (LWP 35693)): #0 0x00007f039c1849f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f039a35b8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f039c180e65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f0399aa988d in clone () from /lib64/libc.so.6 Thread 1 (Thread 0x7f039cff6c40 (LWP 35642)): #0 0x00007f0399aa9e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f039b4513c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f039b448376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x000000000055a2a8 in memcached_main(int, char**) () #7 0x00007f03999cd505 in __libc_start_main () from /lib64/libc.so.6 #8 0x0000000000552a81 in _start () Detaching from program: /opt/couchbase/bin/memcached, process 35642 2021-09-14 02:44:09,440 | test | CRITICAL | MainThread | [basetestcase:check_coredump_exist:905] 172.23.123.119: Found ' CRITICAL ' logs - ['memcached<0.129.0>: 2021-09-14T02:44:02.768350-07:00 CRITICAL *** Fatal error encountered during exception handling ***\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768376-07:00 CRITICAL Caught unhandled std::exception-derived exception. what(): Monotonic (unlabelled) invariant failed: new value (0) breaks invariant on current value (6329)\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768378-07:00 CRITICAL Exception thrown from:\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768415-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x95207]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768439-07:00 CRITICAL #1 /opt/couchbase/bin/memcached() [0x400000+0xc3c47]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768458-07:00 CRITICAL #2 /opt/couchbase/bin/memcached() [0x400000+0x3ef06d]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768478-07:00 CRITICAL #3 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768493-07:00 CRITICAL #4 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768506-07:00 CRITICAL #5 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768519-07:00 CRITICAL #6 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768532-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768552-07:00 CRITICAL #8 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768571-07:00 CRITICAL #9 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768594-07:00 CRITICAL #10 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768614-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768634-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768653-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768694-07:00 CRITICAL #14 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f7d3cf37000+0xcdd40]\n', 'memcached<0.129.0>: 
2021-09-14T02:44:02.768700-07:00 CRITICAL #15 /lib64/libpthread.so.0() [0x7f7d3ee1d000+0x7e65]\n', 'memcached<0.129.0>: 2021-09-14T02:44:02.768732-07:00 CRITICAL #16 /lib64/libc.so.6(clone+0x6d) [0x7f7d3c64f000+0xfe88d]\n', '[ns_server:info,2021-09-14T02:44:03.385-07:00,babysitter_of_ns_1@cb.local:<0.129.0>:ns_port_server:log:221]memcached<0.129.0>: CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/f344d4be-47fa-4521-4c835ca8-226ee566.dmp before terminating.\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429780-07:00 CRITICAL Detected previous crash\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429826-07:00 CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/f344d4be-47fa-4521-4c835ca8-226ee566.dmp before terminating.\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429837-07:00 CRITICAL Stack backtrace of crashed thread:\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429839-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x68c438]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429840-07:00 CRITICAL #1 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler12GenerateDumpEPNS0_12CrashContextE+0x3ea) [0x400000+0x6e1a4a]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429841-07:00 CRITICAL #2 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler13SignalHandlerEiP9siginfo_tPv+0xb8) [0x400000+0x6e1d88]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429843-07:00 CRITICAL #3 /lib64/libpthread.so.0() [0x7f7d3ee1d000+0xf5f0]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429844-07:00 CRITICAL #4 /lib64/libc.so.6(gsignal+0x37) [0x7f7d3c64f000+0x36337]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429845-07:00 CRITICAL #5 /lib64/libc.so.6(abort+0x148) [0x7f7d3c64f000+0x37a28]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429846-07:00 CRITICAL #6 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f7d3cf37000+0x9963c]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429848-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x699a3b]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429849-07:00 CRITICAL #8 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f7d3cf37000+0xa48f6]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429850-07:00 CRITICAL #9 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f7d3cf37000+0xa4961]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429851-07:00 CRITICAL #10 /opt/couchbase/bin/../lib/libstdc++.so.6(__cxa_rethrow+0x46) [0x7f7d3cf37000+0xa4c46]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429880-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0xc3efe]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429903-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429905-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429907-07:00 CRITICAL #14 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429935-07:00 CRITICAL #15 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429937-07:00 CRITICAL #16 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429938-07:00 CRITICAL #17 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429960-07:00 CRITICAL #18 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.426.0>: 
2021-09-14T02:44:03.429961-07:00 CRITICAL #19 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429962-07:00 CRITICAL #20 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.429984-07:00 CRITICAL #21 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.430004-07:00 CRITICAL #22 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.430024-07:00 CRITICAL #23 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f7d3cf37000+0xcdd40]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.430028-07:00 CRITICAL #24 /lib64/libpthread.so.0() [0x7f7d3ee1d000+0x7e65]\n', 'memcached<0.426.0>: 2021-09-14T02:44:03.430030-07:00 CRITICAL #25 /lib64/libc.so.6(clone+0x6d) [0x7f7d3c64f000+0xfe88d]\n'] ERROR ====================================================================== During the test, ERROR: test_data_load_collections_with_hard_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) Remote Connections: 90, Disconnections: 90 SDK Connections: 38, Disconnections: 38 ---------------------------------------------------------------------- Traceback (most recent call last): File "pytests/bucket_collections/collections_rebalance.py", line 95, in tearDown super(CollectionsRebalance, self).tearDown() File "pytests/bucket_collections/collections_base.py", line 92, in tearDown super(CollectionBase, self).tearDown() File "pytests/basetestcase.py", line 1051, in tearDown super(ClusterSetup, self).tearDown() File "pytests/basetestcase.py", line 529, in tearDown self.assertFalse(result, msg="Cb_log file validation failed") AssertionError: Cb_log file validation failed ---------------------------------------------------------------------- Ran 1 test in 1172.019s FAILED (errors=1) summary so far suite bucket_collections.collections_rebalance.CollectionsRebalance , pass 0 , fail 3 failures so far... 
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery testrunner logs, diags and results are available under /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_3 Logs will be stored at /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_4 guides/gradlew --refresh-dependencies testrunner -P jython=/opt/jython/bin/jython -P 'args=-i /tmp/testexec.38389.ini GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 -t bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=40,skip_validations=False,GROUP=P0_failover_and_recovery_dgm' Test Input params: {'data_load_stage': 'during', 'conf_file': 'conf/collections/collections_failover_dgm.conf', 'upgrade_version': '7.1.0-1277', 'dgm': '40', 'spec': 'collections_failover_dgm', 'rerun': 'False', 'num_nodes': 5, 'GROUP': 'P0_failover_and_recovery_dgm', 'enable_dp': 'True', 'bucket_spec': 'dgm.buckets_for_rebalance_tests', 'case_number': 4, 'cluster_name': 'testexec.38389', 'nodes_failover': '1', 'ini': '/tmp/testexec.38389.ini', 'get-cbcollect-info': 'True', 'recovery_type': 'delta', 'log_level': 'error', 'bucket_storage': 'magma', 'skip_validations': 'False', 'logs_folder': '/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_4', 'nodes_init': '3', 'infra_log_level': 'critical'} test_data_load_collections_with_hard_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance) ... 
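Note: the repeated "Connection refused" errors that follow are the test framework polling each node's REST endpoint (http://<node>:8091/nodes/self) every couple of seconds while the nodes come back up after the crash handling above. A minimal sketch of that polling pattern (hypothetical helper name and timings; not the actual rest_client implementation) could look like:

    import time
    import requests

    def wait_for_node(host, port=8091, timeout_s=300, interval_s=2):
        # Poll /nodes/self until ns_server answers on the REST port or we time out.
        url = "http://%s:%d/nodes/self" % (host, port)
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            try:
                # Any HTTP reply (even 401) means the REST service is back up.
                requests.get(url, timeout=5)
                return True
            except requests.exceptions.ConnectionError:
                # Surfaces in the log below as "[Errno 111] Connection refused".
                time.sleep(interval_s)
        return False

    # e.g. wait_for_node("172.23.123.119")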
Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x3d45a type=client open_count=1 channel=[id: 0xeb53c1a7, 0.0.0.0/0.0.0.0:45633] timeout=300.0>) 2021-09-14 02:49:42,280 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x3d45b type=client open_count=1 channel=[id: 0xf8afa2a8, 0.0.0.0/0.0.0.0:49666] timeout=300.0>) 2021-09-14 02:49:44,293 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x3d45c type=client open_count=1 channel=[id: 0x453d5da1, 0.0.0.0/0.0.0.0:33949] timeout=300.0>) 2021-09-14 02:49:46,303 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x3d45d type=client open_count=1 channel=[id: 0xa4427b2e, 0.0.0.0/0.0.0.0:52391] timeout=300.0>) 2021-09-14 02:49:48,315 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.123.119:8091 during connect (<_realsocket at 0x3d45e type=client open_count=1 channel=[id: 0x15da8272, 0.0.0.0/0.0.0.0:44877] timeout=300.0>) 2021-09-14 02:49:50,326 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.123.119:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x3d45f type=client open_count=1 channel=[id: 0x85eb4f17, 0.0.0.0/0.0.0.0:60860] timeout=300.0>) 2021-09-14 02:49:55,831 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x3d460 type=client open_count=1 channel=[id: 0xc8fc02a3, 0.0.0.0/0.0.0.0:52525] timeout=300.0>) 2021-09-14 02:49:57,842 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x3d461 type=client open_count=1 channel=[id: 0x3ed9f809, 0.0.0.0/0.0.0.0:36774] timeout=300.0>) 2021-09-14 02:49:59,852 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.222:8091 during connect (<_realsocket at 0x3d462 type=client open_count=1 channel=[id: 0xc83f98c0, 0.0.0.0/0.0.0.0:38645] timeout=300.0>) 2021-09-14 02:50:01,862 | test | ERROR | MainThread | 
[rest_client:_http_request:826] Socket error while connecting to http://172.23.121.222:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x3d463 type=client open_count=1 channel=[id: 0xfa50b64a, 0.0.0.0/0.0.0.0:40390] timeout=300.0>) 2021-09-14 02:50:07,398 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x3d464 type=client open_count=1 channel=[id: 0x3afa077f, 0.0.0.0/0.0.0.0:55741] timeout=300.0>) 2021-09-14 02:50:09,411 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x3d465 type=client open_count=1 channel=[id: 0xdbd958d2, 0.0.0.0/0.0.0.0:47146] timeout=300.0>) 2021-09-14 02:50:11,423 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x3d466 type=client open_count=1 channel=[id: 0x0a89f434, 0.0.0.0/0.0.0.0:57725] timeout=300.0>) 2021-09-14 02:50:13,433 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.121.221:8091 during connect (<_realsocket at 0x3d467 type=client open_count=1 channel=[id: 0xae95898b, 0.0.0.0/0.0.0.0:59115] timeout=300.0>) 2021-09-14 02:50:15,444 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.121.221:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x3d468 type=client open_count=1 channel=[id: 0xda978a74, 0.0.0.0/0.0.0.0:59945] timeout=300.0>) 2021-09-14 02:50:21,565 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x3d469 type=client open_count=1 channel=[id: 0x41db5656, 0.0.0.0/0.0.0.0:39719] timeout=300.0>) 2021-09-14 02:50:23,576 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x3d46a type=client open_count=1 channel=[id: 0x3bc0c5cf, 0.0.0.0/0.0.0.0:57883] timeout=300.0>) 2021-09-14 02:50:25,586 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x3d46b type=client open_count=1 channel=[id: 
0xc71527cc, 0.0.0.0/0.0.0.0:36521] timeout=300.0>) 2021-09-14 02:50:27,599 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.8:8091 during connect (<_realsocket at 0x3d46c type=client open_count=1 channel=[id: 0xce53127e, 0.0.0.0/0.0.0.0:57589] timeout=300.0>) 2021-09-14 02:50:29,611 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.8:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d46d type=client open_count=1 channel=[id: 0x46fd0864, 0.0.0.0/0.0.0.0:38558] timeout=300.0>) 2021-09-14 02:50:37,184 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d46e type=client open_count=1 channel=[id: 0x236bdf62, 0.0.0.0/0.0.0.0:41240] timeout=300.0>) 2021-09-14 02:50:39,194 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d46f type=client open_count=1 channel=[id: 0x6136ba6e, 0.0.0.0/0.0.0.0:43178] timeout=300.0>) 2021-09-14 02:50:41,207 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d470 type=client open_count=1 channel=[id: 0x9aa1abb2, 0.0.0.0/0.0.0.0:57317] timeout=300.0>) 2021-09-14 02:50:43,217 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d471 type=client open_count=1 channel=[id: 0xac28a849, 0.0.0.0/0.0.0.0:59387] timeout=300.0>) 2021-09-14 02:50:45,229 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d472 type=client open_count=1 channel=[id: 0xf862c38e, 0.0.0.0/0.0.0.0:41283] timeout=300.0>) 2021-09-14 02:50:47,240 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d473 type=client open_count=1 channel=[id: 0x20a2608b, 0.0.0.0/0.0.0.0:39957] timeout=300.0>) 2021-09-14 02:50:49,250 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: 
/172.23.106.45:8091 during connect (<_realsocket at 0x3d474 type=client open_count=1 channel=[id: 0x2413f43d, 0.0.0.0/0.0.0.0:41959] timeout=300.0>) 2021-09-14 02:50:51,260 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused Got this failure java.net.ConnectException: Connection refused: /172.23.106.45:8091 during connect (<_realsocket at 0x3d475 type=client open_count=1 channel=[id: 0xbcce5793, 0.0.0.0/0.0.0.0:38915] timeout=300.0>) 2021-09-14 02:50:53,272 | test | ERROR | MainThread | [rest_client:_http_request:826] Socket error while connecting to http://172.23.106.45:8091/nodes/self error [Errno 111] Connection refused ############################## 172.23.121.222: 1 core dump seen running: //opt/couchbase/bin/minidump-2-core /opt/couchbase/var/lib/couchbase/crash/53b2e9b9-7a98-45a5-ad5431aa-ef85228a.dmp > /opt/couchbase/var/lib/couchbase/crash/53b2e9b9-7a98-45a5-ad5431aa-ef85228a.core running: gdb --batch /opt/couchbase/bin/memcached -c /opt/couchbase/var/lib/couchbase/crash/53b2e9b9-7a98-45a5-ad5431aa-ef85228a.core -ex "bt full" -ex quit 172.23.121.222: Stack Trace of first crash - 53b2e9b9-7a98-45a5-ad5431aa-ef85228a.dmp Core was generated by `/opt/couchbase/bin/memcached -C /opt/couchbase/var/lib/couchbase/config/memcach'. #0 0x00007fa8df4f9337 in raise () from /lib64/libc.so.6 #0 0x00007fa8df4f9337 in raise () from /lib64/libc.so.6 No symbol table info available. #1 0x00007fa8df4faa28 in abort () from /lib64/libc.so.6 No symbol table info available. #2 0x00007fa8dfe4463c in __gnu_cxx::__verbose_terminate_handler () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/vterminate.cc:95 terminating = false t = #3 0x0000000000a99a3b in backtrace_terminate_handler() () No symbol table info available. #4 0x00007fa8dfe4f8f6 in __cxxabiv1::__terminate(void (*)()) () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:48 No locals. #5 0x00007fa8dfe4f961 in std::terminate () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_terminate.cc:58 No locals. #6 0x00007fa8dfe4fc46 in __cxxabiv1::__cxa_rethrow () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/libsupc++/eh_throw.cc:133 globals = header = #7 0x00000000004c3efe in EPBucket::compactionCompletionCallback(CompactionContext&) [clone .cold] () No symbol table info available. #8 0x00000000008524f2 in MagmaKVStore::compactDBInternal(std::unique_lock&, std::shared_ptr) () No symbol table info available. #9 0x0000000000852f76 in MagmaKVStore::compactDB(std::unique_lock&, std::shared_ptr) () No symbol table info available. #10 0x00000000007eb3ba in EPBucket::compactInternal(LockedVBucketPtr&, CompactionConfig&) () No symbol table info available. #11 0x00000000007ec981 in EPBucket::doCompact(Vbid, CompactionConfig&, std::vector >&) () No symbol table info available. #12 0x0000000000706666 in CompactTask::run() () No symbol table info available. #13 0x0000000000a0ae52 in GlobalTask::execute() () No symbol table info available. #14 0x0000000000a07f75 in FollyExecutorPool::TaskProxy::scheduleViaCPUPool()::{lambda()#2}::operator()() const () No symbol table info available. #15 0x0000000000b59b30 in folly::ThreadPoolExecutor::runTask(std::shared_ptr const&, folly::ThreadPoolExecutor::Task&&) () No symbol table info available. #16 0x0000000000b418ea in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () No symbol table info available. 
#17 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () No symbol table info available. #18 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () No symbol table info available. #19 0x00007fa8dfe78d40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 No locals. #20 0x00007fa8e1c98e65 in start_thread () from /lib64/libpthread.so.0 No symbol table info available. #21 0x00007fa8df5c188d in clone () from /lib64/libc.so.6 No symbol table info available. ############################## running: gdb -p `(pidof memcached)` -ex "thread apply all bt" -ex detach -ex quit [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib64/libthread_db.so.1". Loaded symbols for /lib64/libpthread.so.0 Reading symbols from /lib64/librt.so.1...(no debugging symbols found)...done. Loaded symbols for /lib64/librt.so.1 Reading symbols from /opt/couchbase/bin/../lib/liblz4.so.1...Reading symbols from /opt/couchbase/bin/../lib/liblz4.so.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/liblz4.so.1 Reading symbols from /lib64/libsnappy.so.1...Reading symbols from /lib64/libsnappy.so.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /lib64/libsnappy.so.1 Reading symbols from /opt/couchbase/bin/../lib/libjemalloc.so.2...Reading symbols from /opt/couchbase/bin/../lib/libjemalloc.so.2...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libjemalloc.so.2 Reading symbols from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_extra-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_pthreads-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7...Reading symbols from /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libevent_openssl-2.1.so.7 Reading symbols from /opt/couchbase/bin/../lib/libssl.so.1.1...Reading symbols from /opt/couchbase/bin/../lib/libssl.so.1.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libssl.so.1.1 Reading symbols from /opt/couchbase/bin/../lib/libcrypto.so.1.1...Reading symbols from /opt/couchbase/bin/../lib/libcrypto.so.1.1...(no debugging symbols found)...done. (no debugging symbols found)...done. Loaded symbols for /opt/couchbase/bin/../lib/libcrypto.so.1.1 Reading symbols from /opt/couchbase/bin/../lib/libstdc++.so.6...done. 
Loaded symbols for /opt/couchbase/bin/../lib/libstdc++.so.6 Reading symbols from /lib64/libm.so.6...(no debugging symbols found)...done. Loaded symbols for /lib64/libm.so.6 Reading symbols from /opt/couchbase/bin/../lib/libgcc_s.so.1...done. Loaded symbols for /opt/couchbase/bin/../lib/libgcc_s.so.1 Reading symbols from /lib64/libc.so.6...(no debugging symbols found)...done. Loaded symbols for /lib64/libc.so.6 Reading symbols from /lib64/ld-linux-x86-64.so.2...(no debugging symbols found)...done. Loaded symbols for /lib64/ld-linux-x86-64.so.2 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 To enable execution of this file add add-auto-load-safe-path /opt/couchbase/lib/libstdc++.so.6.0.28-gdb.py line to your configuration file "/root/.gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/root/.gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" Thread 47 (Thread 0x7f18a93ff700 (LWP 41832)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18ab9495a7 in background_thread_entry () from /opt/couchbase/bin/../lib/libjemalloc.so.2 #2 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 46 (Thread 0x7f18a85fe700 (LWP 41833)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000ac40aa in spdlog::details::thread_pool::process_next_msg_() () #2 0x0000000000ac43f8 in _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJZN6spdlog7details11thread_poolC4EmmSt8functionIFvvEEEUlvE_EEEEE6_M_runEv () #3 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 45 (Thread 0x7f18a7dfd700 (LWP 41834)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000ac3e78 in _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJZN6spdlog7details15periodic_workerC4ERKSt8functionIFvvEENSt6chrono8durationIlSt5ratioILl1ELl1EEEEEUlvE_EEEEE6_M_runEv () #2 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #3 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 44 (Thread 0x7f18a75fc700 (LWP 41835)): #0 0x00007f18a9d5dbed in poll () from /lib64/libc.so.6 #1 0x000000000059e201 in check_stdin_thread(void*) () #2 0x0000000000af6fe9 in platform_thread_wrap(void*) () #3 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 43 (Thread 0x7f18a55f8700 (LWP 41839)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00000000005e4906 in ExternalAuthManagerThread::run() () #2 0x0000000000b044f3 in Couchbase::Thread::thread_entry() () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 42 (Thread 0x7f18a49f7700 (LWP 41840)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from 
/lib64/libpthread.so.0 #1 0x000000000066c827 in AuditImpl::consume_events() () #2 0x0000000000af6fe9 in platform_thread_wrap(void*) () #3 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #4 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 41 (Thread 0x7f18a5df9700 (LWP 41841)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000a3899b in worker_thread () #2 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 40 (Thread 0x7f18a65fa700 (LWP 41842)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x0000000000a3899b in worker_thread () #2 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 39 (Thread 0x7f18a6dfb700 (LWP 41843)): #0 0x00007f18a9d5dbed in poll () from /lib64/libc.so.6 #1 0x0000000000a2f908 in master_thread () #2 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 38 (Thread 0x7f189ffff700 (LWP 41844)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 37 (Thread 0x7f189f7fe700 (LWP 41845)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 36 (Thread 0x7f189effd700 (LWP 41846)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 35 (Thread 0x7f189e7fc700 (LWP 41847)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from 
/opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x00000000005a3549 in worker_libevent(void*) () #7 0x0000000000af6fe9 in platform_thread_wrap(void*) () #8 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #9 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 34 (Thread 0x7f1897fff700 (LWP 41848)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 33 (Thread 0x7f18977fe700 (LWP 41849)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 32 (Thread 0x7f1896ffd700 (LWP 41850)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 
0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 31 (Thread 0x7f1894fd7700 (LWP 41851)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 30 (Thread 0x7f1883fff700 (LWP 41852)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool 
folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 29 (Thread 0x7f18837fe700 (LWP 41853)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 28 (Thread 0x7f1882ffd700 (LWP 41854)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in 
execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 27 (Thread 0x7f18827fc700 (LWP 41855)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 26 (Thread 0x7f1881ffb700 (LWP 41856)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b4953d in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 25 (Thread 0x7f18817fa700 (LWP 41857)): #0 0x00007f18a9d62ba9 in syscall () from /lib64/libc.so.6 #1 0x0000000000b3960d in folly::detail::futexWaitImpl(std::atomic const*, unsigned int, std::chrono::time_point > > const*, std::chrono::time_point > > const*, unsigned int) () #2 0x0000000000b436da in folly::detail::FutexResult 
folly::detail::futexWaitUntil, std::chrono::_V2::steady_clock, std::chrono::duration > >(std::atomic const*, unsigned int, std::chrono::time_point > > const&, unsigned int) () #3 0x0000000000b49677 in bool folly::SaturatingSemaphore::tryWaitSlow > >(std::chrono::time_point > > const&, folly::WaitOptions const&) () #4 0x0000000000b49ba6 in bool folly::detail::LifoSemBase, std::atomic>::try_wait_until > >(std::chrono::time_point > > const&) () #5 0x0000000000b4d300 in folly::UnboundedBlockingQueue::try_take_for(std::chrono::duration >) () #6 0x0000000000b41877 in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #9 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #10 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #11 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 24 (Thread 0x7f1880ff9700 (LWP 41858)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x000000000092e08c in magma::WaitGroup::Wait() () #4 0x00000000009b1bfd in magma::SSTableLoader::Load() () #5 0x00000000009aac2c in magma::TreeSnapshot::decode(std::__cxx11::basic_string, std::allocator > const&, magma::TableLoader&, std::function > ()>) () #6 0x00000000009ab846 in magma::TreeSnapshot::Read(unsigned long, magma::TableLoader&, std::function > ()>) () #7 0x00000000009917f0 in magma::LSMTree::loadCheckpoints(std::vector > const&, std::vector >&) () #8 0x0000000000982ce9 in magma::LSMTree::Open(std::vector > const&) () #9 0x0000000000923b55 in magma::KVStore::Open() () #10 0x00000000008c87ef in magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&)::{lambda()#4}::operator()() const () #11 0x00000000008c8ced in std::_Function_handler > (), magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&)::{lambda()#4}>::_M_invoke(std::_Any_data const&) () #12 0x00000000008fb1e3 in magma::KVStoreSet::CreateKVStore(std::function > ()>) () #13 0x00000000008ce541 in magma::Magma::Impl::createKVStore(unsigned short, unsigned int, std::shared_ptr&) () #14 0x0000000000926ef3 in magma::Magma::Impl::recovery() () #15 0x00000000008c769f in magma::Magma::Impl::Open() () #16 0x00000000008c784d in magma::Magma::Open() () #17 0x0000000000858cd1 in MagmaMemoryTrackingProxy::Open() () #18 0x000000000084e691 in MagmaKVStore::MagmaKVStore(MagmaKVStoreConfig&) () #19 0x00000000007ffc4b in KVStoreFactory::create(KVStoreConfig&) () #20 0x00000000006f13cb in KVShard::KVShard(EventuallyPersistentEngine&, unsigned short) () #21 0x0000000000720b3f in VBucketMap::VBucketMap(Configuration&, KVBucket&) () #22 0x00000000006e44fb in KVBucket::KVBucket(EventuallyPersistentEngine&) () #23 0x00000000007e92f6 in EPBucket::EPBucket(EventuallyPersistentEngine&) () #24 0x0000000000697188 in EventuallyPersistentEngine::makeBucket(Configuration&) () #25 0x000000000069c735 in 
EventuallyPersistentEngine::initialize(char const*) () #26 0x00000000005b4635 in BucketManager::create(Cookie&, std::__cxx11::basic_string, std::allocator >, std::__cxx11::basic_string, std::allocator >, BucketType) () #27 0x000000000061e60e in std::_Function_handler::_M_invoke(std::_Any_data const&) () #28 0x000000000061cbc4 in OneShotTask::run() () #29 0x0000000000a0ae52 in GlobalTask::execute() () #30 0x0000000000a07f75 in FollyExecutorPool::TaskProxy::scheduleViaCPUPool()::{lambda()#2}::operator()() const () #31 0x0000000000b59b30 in folly::ThreadPoolExecutor::runTask(std::shared_ptr const&, folly::ThreadPoolExecutor::Task&&) () #32 0x0000000000b418ea in folly::CPUThreadPoolExecutor::threadRun(std::shared_ptr) () #33 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #34 0x0000000000a07c04 in void folly::detail::function::FunctionTraits::callBig&&)::{lambda()#1}>(folly::detail::function::Data&) () #35 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #36 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #37 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 23 (Thread 0x7f1863fff700 (LWP 41859)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x0000000000b51359 in folly::IOThreadPoolExecutor::threadRun(std::shared_ptr) () #7 0x0000000000b5cae9 in void folly::detail::function::FunctionTraits::callBig))(std::shared_ptr)> >(folly::detail::function::Data&) () #8 0x00007f18aa61fd40 in execute_native_thread_routine () at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/thread.cc:80 #9 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #10 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 22 (Thread 0x7f18637fe700 (LWP 41860)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18ab948e51 in background_thread_entry () from /opt/couchbase/bin/../lib/libjemalloc.so.2 #2 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #3 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 21 (Thread 0x7f1862ffd700 (LWP 41861)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 20 (Thread 0x7f18627fc700 (LWP 41862)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 19 (Thread 0x7f1861ffb700 (LWP 41863)): #0 0x00007f18ac443da2 in 
pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 18 (Thread 0x7f18617fa700 (LWP 41864)): #0 0x00007f18ac443da2 in pthread_cond_timedwait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x000000000094784f in magma::TaskQueue::dequeue(std::atomic&) () #2 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #3 0x0000000000af6fe9 in platform_thread_wrap(void*) () #4 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #5 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 17 (Thread 0x7f1860ff9700 (LWP 41865)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 16 (Thread 0x7f1847fff700 (LWP 41866)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 15 (Thread 0x7f18477fe700 (LWP 41867)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 14 (Thread 0x7f1846ffd700 (LWP 41868)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 13 (Thread 0x7f1844eff700 (LWP 41873)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 12 (Thread 0x7f1843eef700 (LWP 41874)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 11 (Thread 0x7f18436ee700 (LWP 41875)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 10 (Thread 0x7f1842eed700 (LWP 41876)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 9 (Thread 0x7f18426ec700 (LWP 41877)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 8 (Thread 0x7f1841eeb700 (LWP 41878)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 7 (Thread 0x7f18416ea700 (LWP 41879)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 6 (Thread 0x7f1840ee9700 (LWP 41880)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 5 (Thread 0x7f18406e8700 (LWP 41881)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 4 (Thread 0x7f183fee7700 (LWP 41882)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 3 (Thread 0x7f183f6e6700 (LWP 41883)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 2 (Thread 0x7f183eee5700 (LWP 41884)): #0 0x00007f18ac4439f5 in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0 #1 0x00007f18aa61a8bc in __gthread_cond_wait (__mutex=, __cond=) at /tmp/deploy/objdir/x86_64-pc-linux-gnu/libstdc++-v3/include/x86_64-pc-linux-gnu/bits/gthr-default.h:865 #2 std::condition_variable::wait (this=, __lock=...) 
at /tmp/deploy/objdir/../gcc-10.2.0/libstdc++-v3/src/c++11/condition_variable.cc:53 #3 0x00000000009475b8 in magma::TaskQueue::dequeue(std::atomic&) () #4 0x00000000009479a1 in magma::TaskWorker::loop(void*) () #5 0x0000000000af6fe9 in platform_thread_wrap(void*) () #6 0x00007f18ac43fe65 in start_thread () from /lib64/libpthread.so.0 #7 0x00007f18a9d6888d in clone () from /lib64/libc.so.6 Thread 1 (Thread 0x7f18ad2b6c40 (LWP 41831)): #0 0x00007f18a9d68e63 in epoll_wait () from /lib64/libc.so.6 #1 0x00007f18ab7103c5 in epoll_dispatch () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #2 0x00007f18ab707376 in event_base_loop () from /opt/couchbase/bin/../lib/libevent_core-2.1.so.7 #3 0x0000000000b63217 in folly::EventBase::loopBody(int, bool) () #4 0x0000000000b636e6 in folly::EventBase::loop() () #5 0x0000000000b650a6 in folly::EventBase::loopForever() () #6 0x000000000055a2a8 in memcached_main(int, char**) () #7 0x00007f18a9c8c505 in __libc_start_main () from /lib64/libc.so.6 #8 0x0000000000552a81 in _start () Detaching from program: /opt/couchbase/bin/memcached, process 41831 2021-09-14 03:04:00,947 | test | CRITICAL | MainThread | [basetestcase:check_coredump_exist:905] 172.23.121.222: Found ' CRITICAL ' logs - ['[ns_server:info,2021-09-14T02:44:11.482-07:00,babysitter_of_ns_1@cb.local:<0.129.0>:ns_port_server:log:221]memcached<0.129.0>: 2021-09-14T02:44:11.281982-07:00 CRITICAL *** Fatal error encountered during exception handling ***\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282018-07:00 CRITICAL Caught unhandled std::exception-derived exception. what(): Monotonic (unlabelled) invariant failed: new value (0) breaks invariant on current value (4786)\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282022-07:00 CRITICAL Exception thrown from:\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282063-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x95207]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282138-07:00 CRITICAL #1 /opt/couchbase/bin/memcached() [0x400000+0xc3c47]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282161-07:00 CRITICAL #2 /opt/couchbase/bin/memcached() [0x400000+0x3ef06d]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282178-07:00 CRITICAL #3 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282200-07:00 CRITICAL #4 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282213-07:00 CRITICAL #5 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282242-07:00 CRITICAL #6 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282262-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282283-07:00 CRITICAL #8 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282308-07:00 CRITICAL #9 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282338-07:00 CRITICAL #10 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282366-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282387-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282411-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282457-07:00 CRITICAL #14 
/opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f05ff826000+0xcdd40]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282470-07:00 CRITICAL #15 /lib64/libpthread.so.0() [0x7f060170c000+0x7e65]\n', 'memcached<0.129.0>: 2021-09-14T02:44:11.282503-07:00 CRITICAL #16 /lib64/libc.so.6(clone+0x6d) [0x7f05fef3e000+0xfe88d]\n', '[ns_server:info,2021-09-14T02:44:11.841-07:00,babysitter_of_ns_1@cb.local:<0.129.0>:ns_port_server:log:221]memcached<0.129.0>: CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/0257718a-7df2-4f8c-0eca94a4-4c092768.dmp before terminating.\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887897-07:00 CRITICAL Detected previous crash\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887943-07:00 CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/0257718a-7df2-4f8c-0eca94a4-4c092768.dmp before terminating.\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887953-07:00 CRITICAL Stack backtrace of crashed thread:\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887954-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x68c438]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887956-07:00 CRITICAL #1 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler12GenerateDumpEPNS0_12CrashContextE+0x3ea) [0x400000+0x6e1a4a]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887958-07:00 CRITICAL #2 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler13SignalHandlerEiP9siginfo_tPv+0xb8) [0x400000+0x6e1d88]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887959-07:00 CRITICAL #3 /lib64/libpthread.so.0() [0x7f060170c000+0xf5f0]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887961-07:00 CRITICAL #4 /lib64/libc.so.6(gsignal+0x37) [0x7f05fef3e000+0x36337]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887962-07:00 CRITICAL #5 /lib64/libc.so.6(abort+0x148) [0x7f05fef3e000+0x37a28]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887963-07:00 CRITICAL #6 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f05ff826000+0x9963c]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887965-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x699a3b]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887966-07:00 CRITICAL #8 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f05ff826000+0xa48f6]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887967-07:00 CRITICAL #9 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f05ff826000+0xa4961]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887969-07:00 CRITICAL #10 /opt/couchbase/bin/../lib/libstdc++.so.6(__cxa_rethrow+0x46) [0x7f05ff826000+0xa4c46]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.887981-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0xc3efe]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888008-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888010-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888011-07:00 CRITICAL #14 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888015-07:00 CRITICAL #15 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888037-07:00 CRITICAL #16 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888039-07:00 CRITICAL #17 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.564.0>: 
2021-09-14T02:44:11.888040-07:00 CRITICAL #18 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888064-07:00 CRITICAL #19 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888087-07:00 CRITICAL #20 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888091-07:00 CRITICAL #21 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888093-07:00 CRITICAL #22 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888095-07:00 CRITICAL #23 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7f05ff826000+0xcdd40]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888144-07:00 CRITICAL #24 /lib64/libpthread.so.0() [0x7f060170c000+0x7e65]\n', 'memcached<0.564.0>: 2021-09-14T02:44:11.888145-07:00 CRITICAL #25 /lib64/libc.so.6(clone+0x6d) [0x7f05fef3e000+0xfe88d]\n', '[ns_server:info,2021-09-14T02:45:31.801-07:00,babysitter_of_ns_1@cb.local:<0.564.0>:ns_port_server:log:221]memcached<0.564.0>: 2021-09-14T02:45:31.600101-07:00 CRITICAL *** Fatal error encountered during exception handling ***\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600159-07:00 CRITICAL Caught unhandled std::exception-derived exception. what(): Monotonic (unlabelled) invariant failed: new value (0) breaks invariant on current value (3922)\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600162-07:00 CRITICAL Exception thrown from:\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600201-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x95207]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600215-07:00 CRITICAL #1 /opt/couchbase/bin/memcached() [0x400000+0xc3c47]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600231-07:00 CRITICAL #2 /opt/couchbase/bin/memcached() [0x400000+0x3ef06d]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600248-07:00 CRITICAL #3 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600262-07:00 CRITICAL #4 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600276-07:00 CRITICAL #5 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600289-07:00 CRITICAL #6 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600302-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600322-07:00 CRITICAL #8 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600341-07:00 CRITICAL #9 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600362-07:00 CRITICAL #10 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600383-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600639-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600660-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600731-07:00 CRITICAL #14 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fae92ea6000+0xcdd40]\n', 'memcached<0.564.0>: 2021-09-14T02:45:31.600744-07:00 CRITICAL #15 /lib64/libpthread.so.0() [0x7fae94d8c000+0x7e65]\n', "memcached<0.564.0>: terminate called after throwing an instance of 
'2021-09-14T02:45:31.600778-07:00 CRITICAL #16 /lib64/libc.so.6(clone+0x6d) [0x7fae925be000+0xfe88d]\n", '[ns_server:info,2021-09-14T02:45:32.110-07:00,babysitter_of_ns_1@cb.local:<0.564.0>:ns_port_server:log:221]memcached<0.564.0>: CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/7b4d4f5c-6abd-49bd-53169ba0-0841c7ce.dmp before terminating.\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148289-07:00 CRITICAL Detected previous crash\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148341-07:00 CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/7b4d4f5c-6abd-49bd-53169ba0-0841c7ce.dmp before terminating.\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148352-07:00 CRITICAL Stack backtrace of crashed thread:\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148354-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x68c438]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148356-07:00 CRITICAL #1 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler12GenerateDumpEPNS0_12CrashContextE+0x3ea) [0x400000+0x6e1a4a]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148358-07:00 CRITICAL #2 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler13SignalHandlerEiP9siginfo_tPv+0xb8) [0x400000+0x6e1d88]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148359-07:00 CRITICAL #3 /lib64/libpthread.so.0() [0x7fae94d8c000+0xf5f0]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148361-07:00 CRITICAL #4 /lib64/libc.so.6(gsignal+0x37) [0x7fae925be000+0x36337]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148362-07:00 CRITICAL #5 /lib64/libc.so.6(abort+0x148) [0x7fae925be000+0x37a28]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148363-07:00 CRITICAL #6 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fae92ea6000+0x9963c]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148375-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x699a3b]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148376-07:00 CRITICAL #8 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fae92ea6000+0xa48f6]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148380-07:00 CRITICAL #9 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fae92ea6000+0xa4961]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148382-07:00 CRITICAL #10 /opt/couchbase/bin/../lib/libstdc++.so.6(__cxa_rethrow+0x46) [0x7fae92ea6000+0xa4c46]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148384-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0xc3efe]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148385-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148419-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148460-07:00 CRITICAL #14 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148464-07:00 CRITICAL #15 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148466-07:00 CRITICAL #16 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148468-07:00 CRITICAL #17 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148469-07:00 CRITICAL #18 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148490-07:00 CRITICAL #19 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 
'memcached<0.592.0>: 2021-09-14T02:45:32.148493-07:00 CRITICAL #20 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148496-07:00 CRITICAL #21 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148499-07:00 CRITICAL #22 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148502-07:00 CRITICAL #23 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fae92ea6000+0xcdd40]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148504-07:00 CRITICAL #24 /lib64/libpthread.so.0() [0x7fae94d8c000+0x7e65]\n', 'memcached<0.592.0>: 2021-09-14T02:45:32.148505-07:00 CRITICAL #25 /lib64/libc.so.6(clone+0x6d) [0x7fae925be000+0xfe88d]\n', '[ns_server:info,2021-09-14T03:03:46.723-07:00,babysitter_of_ns_1@cb.local:<0.129.0>:ns_port_server:log:221]memcached<0.129.0>: 2021-09-14T03:03:46.522726-07:00 CRITICAL *** Fatal error encountered during exception handling ***\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.522775-07:00 CRITICAL Caught unhandled std::exception-derived exception. what(): Monotonic (unlabelled) invariant failed: new value (0) breaks invariant on current value (4837)\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.522778-07:00 CRITICAL Exception thrown from:\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.522942-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x95207]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.522974-07:00 CRITICAL #1 /opt/couchbase/bin/memcached() [0x400000+0xc3c47]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523077-07:00 CRITICAL #2 /opt/couchbase/bin/memcached() [0x400000+0x3ef06d]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523105-07:00 CRITICAL #3 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523119-07:00 CRITICAL #4 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523134-07:00 CRITICAL #5 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523199-07:00 CRITICAL #6 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523223-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523283-07:00 CRITICAL #8 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523318-07:00 CRITICAL #9 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523384-07:00 CRITICAL #10 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523415-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523475-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523505-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523585-07:00 CRITICAL #14 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fa8dfdab000+0xcdd40]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523617-07:00 CRITICAL #15 /lib64/libpthread.so.0() [0x7fa8e1c91000+0x7e65]\n', 'memcached<0.129.0>: 2021-09-14T03:03:46.523693-07:00 CRITICAL #16 /lib64/libc.so.6(clone+0x6d) [0x7fa8df4c3000+0xfe88d]\n', '[ns_server:info,2021-09-14T03:03:47.244-07:00,babysitter_of_ns_1@cb.local:<0.129.0>:ns_port_server:log:221]memcached<0.129.0>: CRITICAL Breakpad caught a crash (Couchbase version 
7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/53b2e9b9-7a98-45a5-ad5431aa-ef85228a.dmp before terminating.\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370337-07:00 CRITICAL Detected previous crash\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370400-07:00 CRITICAL Breakpad caught a crash (Couchbase version 7.1.0-1277). Writing crash dump to /opt/couchbase/var/lib/couchbase/crash/53b2e9b9-7a98-45a5-ad5431aa-ef85228a.dmp before terminating.\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370410-07:00 CRITICAL Stack backtrace of crashed thread:\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370412-07:00 CRITICAL #0 /opt/couchbase/bin/memcached() [0x400000+0x68c438]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370414-07:00 CRITICAL #1 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler12GenerateDumpEPNS0_12CrashContextE+0x3ea) [0x400000+0x6e1a4a]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370415-07:00 CRITICAL #2 /opt/couchbase/bin/memcached(_ZN15google_breakpad16ExceptionHandler13SignalHandlerEiP9siginfo_tPv+0xb8) [0x400000+0x6e1d88]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370418-07:00 CRITICAL #3 /lib64/libpthread.so.0() [0x7fa8e1c91000+0xf5f0]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370420-07:00 CRITICAL #4 /lib64/libc.so.6(gsignal+0x37) [0x7fa8df4c3000+0x36337]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370422-07:00 CRITICAL #5 /lib64/libc.so.6(abort+0x148) [0x7fa8df4c3000+0x37a28]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370425-07:00 CRITICAL #6 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fa8dfdab000+0x9963c]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370427-07:00 CRITICAL #7 /opt/couchbase/bin/memcached() [0x400000+0x699a3b]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370428-07:00 CRITICAL #8 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fa8dfdab000+0xa48f6]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370429-07:00 CRITICAL #9 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fa8dfdab000+0xa4961]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370434-07:00 CRITICAL #10 /opt/couchbase/bin/../lib/libstdc++.so.6(__cxa_rethrow+0x46) [0x7fa8dfdab000+0xa4c46]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370499-07:00 CRITICAL #11 /opt/couchbase/bin/memcached() [0x400000+0xc3efe]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370502-07:00 CRITICAL #12 /opt/couchbase/bin/memcached() [0x400000+0x4524f2]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370504-07:00 CRITICAL #13 /opt/couchbase/bin/memcached() [0x400000+0x452f76]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370505-07:00 CRITICAL #14 /opt/couchbase/bin/memcached() [0x400000+0x3eb3ba]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370507-07:00 CRITICAL #15 /opt/couchbase/bin/memcached() [0x400000+0x3ec981]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370508-07:00 CRITICAL #16 /opt/couchbase/bin/memcached() [0x400000+0x306666]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370509-07:00 CRITICAL #17 /opt/couchbase/bin/memcached() [0x400000+0x60ae52]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370690-07:00 CRITICAL #18 /opt/couchbase/bin/memcached() [0x400000+0x607f75]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370691-07:00 CRITICAL #19 /opt/couchbase/bin/memcached() [0x400000+0x759b30]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370693-07:00 CRITICAL #20 /opt/couchbase/bin/memcached() [0x400000+0x7418ea]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370694-07:00 CRITICAL #21 /opt/couchbase/bin/memcached() [0x400000+0x75cae9]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370696-07:00 
CRITICAL #22 /opt/couchbase/bin/memcached() [0x400000+0x607c04]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370697-07:00 CRITICAL #23 /opt/couchbase/bin/../lib/libstdc++.so.6() [0x7fa8dfdab000+0xcdd40]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370699-07:00 CRITICAL #24 /lib64/libpthread.so.0() [0x7fa8e1c91000+0x7e65]\n', 'memcached<0.563.0>: 2021-09-14T03:03:47.370700-07:00 CRITICAL #25 /lib64/libc.so.6(clone+0x6d) [0x7fa8df4c3000+0xfe88d]\n']
ERROR
======================================================================
ERROR: test_data_load_collections_with_hard_failover_recovery (bucket_collections.collections_rebalance.CollectionsRebalance)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "pytests/bucket_collections/collections_rebalance.py", line 95, in tearDown
    super(CollectionsRebalance, self).tearDown()
  File "pytests/bucket_collections/collections_base.py", line 92, in tearDown
    super(CollectionBase, self).tearDown()
  File "pytests/basetestcase.py", line 1051, in tearDown
    super(ClusterSetup, self).tearDown()
  File "pytests/basetestcase.py", line 529, in tearDown
    self.assertFalse(result, msg="Cb_log file validation failed")
AssertionError: Cb_log file validation failed
----------------------------------------------------------------------
Ran 1 test in 1211.889s
During the test, Remote Connections: 90, Disconnections: 90
FAILED (errors=1)
SDK Connections: 38, Disconnections: 38
summary so far suite bucket_collections.collections_rebalance.CollectionsRebalance , pass 0 , fail 4
failures so far...
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery
testrunner logs, diags and results are available under /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_02-20-11/test_4
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability=MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability= MAJORITY_AND_PERSIST_TO_ACTIVE,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,skip_validations=False,GROUP=durability_majority_and_persist_active_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,nodes_failover=1,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,override_spec_params=durability;replicas,durability=PERSIST_TO_MAJORITY,replicas=2,bucket_spec=dgm.buckets_for_rebalance_tests,data_load_stage=during,dgm=70,disk_optimized_thread_settings=True,skip_validations=False,GROUP=durability_persist_to_majority_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_rebalance_out,nodes_init=3,nodes_failover=1,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,disk_optimized_thread_settings=True,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_rebalance_out,nodes_init=3,nodes_failover=1,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,disk_optimized_thread_settings=True,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,disk_optimized_thread_settings=True,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=full,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Test 'bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery,nodes_init=3,nodes_failover=1,recovery_type=delta,bucket_spec=dgm.buckets_for_rebalance_tests_with_ttl,data_load_spec=ttl_load,data_load_stage=during,dgm_ttl_test=True,dgm=50,skip_validations=False,GROUP=collections_maxt_ttl_dgm' skipped, GROUP not satisfied
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9/userguide/command_line_interface.html#sec:command_line_warnings
BUILD SUCCESSFUL in 49m 43s
2 actionable tasks: 2 executed
workspace is /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1
fails is 4 4
Desc1: 7.1.0-1277 - collections failover_and_recovery_dgm_7.0_P1 - centos (0/4)
> Configure project :
Executing 'gradle clean'
Using Transaction_client :: 1.1.8
Using Java_client :: 3.1.6
Running: /opt/jython/bin/jython -J-cp /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/ssh.py 7.1.0-1277 --executor_jenkins_job --run_params=GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277
Running: /opt/jython/bin/jython -J-cp
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar scripts/eagles_all_around.py 7.1.0-1277 --executor_jenkins_job --run_params=GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main scripts/install.py 7.1.0-1277 --executor_jenkins_job --run_params=GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources testrunner.py 7.1.0-1277 --executor_jenkins_job --run_params=GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277 Running: /opt/jython/bin/jython -J-cp 
/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/classes/java/main:/data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/build/resources/main:/root/.gradle/caches/modules-2/files-2.1/com.jcraft/jsch/0.1.54/da3584329a263616e277e15462b387addd1b208d/jsch-0.1.54.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-slf4j-impl/2.11.1/4b41b53a3a2d299ce381a69d165381ca19f62912/log4j-slf4j-impl-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/commons-cli/commons-cli/1.4/c51c00206bb913cd8612b24abd9fa98ae89719b1/commons-cli-1.4.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/couchbase-transactions/1.1.8/e69a6013e59f498f76671c1acc43df14f1163180/couchbase-transactions-1.1.8.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/java-client/3.1.6/f065d71963e08bd5577838592e33f2ca35f5d64a/java-client-3.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-api/1.7.25/da76ca59f6a57ee3102f8f9bd9cee742973efa8a/slf4j-api-1.7.25.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.1/592a48674c926b01a9a747c7831bcd82a9e6d6e4/log4j-core-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-api/2.11.1/268f0fe4df3eefe052b57c87ec48517d64fb2a10/log4j-api-2.11.1.jar:/root/.gradle/caches/modules-2/files-2.1/com.couchbase.client/core-io/2.1.6/b3ece73ab7069b1e97669500edc448163c2f4304/core-io-2.1.6.jar:/root/.gradle/caches/modules-2/files-2.1/io.projectreactor/reactor-core/3.4.6/d8d52418db9eea651d4772a619d5c9ea820449b7/reactor-core-3.4.6.jar:/root/.gradle/caches/modules-2/files-2.1/org.reactivestreams/reactive-streams/1.0.3/d9fb7a7926ffa635b3dcaa5049fb2bfa25b3e7d0/reactive-streams-1.0.3.jar:build/classes/java/main:src/main/resources scripts/rerun_jobs.py 7.1.0-1277 --executor_jenkins_job --run_params=GROUP=P0_failover_and_recovery_dgm,rerun=False,get-cbcollect-info=True,infra_log_level=critical,log_level=error,bucket_storage=magma,enable_dp=True,upgrade_version=7.1.0-1277
> Task :compileJava UP-TO-DATE
> Task :rerun_job
reading from centos_collections_failover_and_recovery_dgm_7.0_P1_7.1.0-1277
com.couchbase.client.core.error.DocumentNotFoundException: Document with the given id not found {"completed":true,"coreId":"0x7e653c1000000001","idempotent":true,"lastChannelId":"7E653C1000000001/000000001559B0EA","lastDispatchedFrom":"172.23.123.71:46740","lastDispatchedTo":"172.23.98.63:11210","requestId":43,"requestType":"GetRequest","retried":0,"service":{"bucket":"rerun_jobs","collection":"_default","documentId":"centos_collections_failover_and_recovery_dgm_7.0_P1_7.1.0-1277","opaque":"0x4b","scope":"_default","type":"kv"},"status":"NOT_FOUND","timeoutMs":2500,"timings":{"dispatchMicros":1118,"totalDispatchMicros":1118,"totalServerMicros":0,"totalMicros":8307,"serverMicros":0}}
upserted centos_collections_failover_and_recovery_dgm_7.0_P1_7.1.0-1277
INFO:merge_reports:Merging of report files from logs/**/*.xml
INFO:merge_reports:-- logs/testrunner-21-Sep-14_02-20-11/report-21-Sep-14_02-20-11-bucket_collections.collections_rebalance.CollectionsRebalance.xml --
INFO:merge_reports: Number of TestSuites=1
INFO:merge_reports: TestSuite#1) bucket_collections.collections_rebalance.CollectionsRebalance, Number of Tests=4
summary so far suite bucket_collections.collections_rebalance.CollectionsRebalance , pass 0 , fail 4
failures so far...
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_graceful_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery
bucket_collections.collections_rebalance.CollectionsRebalance.test_data_load_collections_with_hard_failover_recovery
INFO:merge_reports:Summary file is at /data/workspace/centos-p0-collections-vset00-00-failover_and_recovery_dgm_7.0_P1/logs/testrunner-21-Sep-14_03-10-02/merged_summary/mergedreport-21-Sep-14_03-10-02-bucket_collections.collections_rebalance.CollectionsRebalance.xml
No more failed tests. Stopping reruns
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9/userguide/command_line_interface.html#sec:command_line_warnings
BUILD SUCCESSFUL in 17s
2 actionable tasks: 1 executed, 1 up-to-date
[description-setter] Description set: 7.1.0-1277 - collections failover_and_recovery_dgm_7.0_P1 - centos (0/4)
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'propfile'
[EnvInject] - Variables injected successfully.
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Notifying upstream projects of job completion
Email was triggered for: Unstable (Test Failures)
Sending email for trigger: Unstable (Test Failures)
Request made to compress build log
Sending email to: sumedh.basarkod@couchbase.com balakumaran.gopal@couchbase.com
Triggering a new build of savejoblogs
Triggering a new build of test-executor-cleanup
Triggering a new build of test-executor-cleanup-dynvm
Finished: UNSTABLE
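Editor's note on the rerun bookkeeping above: the :rerun_job step reads a per-job document from the rerun_jobs bucket (_default scope/collection), hits DocumentNotFoundException on the first run of a build, and then upserts the key. The following is a minimal, hypothetical sketch of that get-or-upsert pattern using the Couchbase Java SDK 3.x (the java-client 3.1.6 jar on the classpath above); it is not the actual scripts/rerun_jobs.py logic, and the endpoint, credentials, and "fail_count" field are placeholders, not values from this log.

```java
import com.couchbase.client.core.error.DocumentNotFoundException;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.Collection;
import com.couchbase.client.java.json.JsonObject;

public class RerunJobsSketch {
    public static void main(String[] args) {
        // Placeholder endpoint and credentials -- not taken from this build log.
        Cluster cluster = Cluster.connect("couchbase://127.0.0.1", "Administrator", "password");
        Bucket bucket = cluster.bucket("rerun_jobs");
        Collection collection = bucket.defaultCollection();

        // Document id follows the <os>_<suite>_<build> pattern seen in the log.
        String docId = "centos_collections_failover_and_recovery_dgm_7.0_P1_7.1.0-1277";

        JsonObject jobState;
        try {
            // On the first run of a build the key does not exist yet, so the KV get
            // throws DocumentNotFoundException, as logged above.
            jobState = collection.get(docId).contentAsObject();
        } catch (DocumentNotFoundException e) {
            jobState = JsonObject.create();
        }

        // Record some run state (illustrative field) and write the document back,
        // which corresponds to the "upserted <docId>" line in the log.
        jobState.put("fail_count", 4);
        collection.upsert(docId, jobState);

        cluster.disconnect();
    }
}
```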